Is it just me, or has there been a plethora of threads in the various WW forums lately with unrealistic expectations of tools? I'm talking about the "my (you name it) is out .0001 of an inch" or "I bought a (you name it) and it's out of true .0001 of an inch" threads. I'm not sure how feeler gages and dial indicators got into the WW field, but a lot of folks need to step back and have a serious reality check.
Folks, we're talking one TEN-THOUSANDTH of an inch! First off, I sincerely doubt most, if not all, of the folks making these claims have the proper tools to measure .0001. Tools that can accurately measure to that tolerance are VERY expensive and take practice to use. Next, I sincerely doubt most of these folks know the proper methods involved in measuring to those tolerances. For goodness' sake, at .0001 even temperature can alter the size through thermal expansion/contraction. Not to mention most, if not all, of these folks have NO idea about machining tolerances, which generally run in the ±.003 range. Yes, there are exceptions, but that's a good middle-of-the-road tolerance for the average machine job.
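To put a rough number on the thermal point, here's a minimal back-of-the-envelope sketch. The expansion coefficient is an assumed ballpark figure for cast iron, and the 12-inch span and 10-degree swing are hypothetical shop conditions, not measurements:

```python
# Rough thermal-expansion check for a cast-iron machine surface.
# 6.0e-6 per degree F is an assumed ballpark coefficient, not a spec.
def thermal_growth(length_in, delta_temp_f, coeff=6.0e-6):
    """Approximate change in length (inches) for a temperature swing in deg F."""
    return length_in * coeff * delta_temp_f

# A 12-inch span warming 10 degrees F (morning shop to afternoon shop):
growth = thermal_growth(12.0, 10.0)
print(f"{growth:.4f} in")  # prints 0.0007 in -- several times .0001
```

In other words, the machine itself drifts past the .0001 mark just from the shop warming up.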
Let's put this into perspective: an average piece of printer paper is .003 to .004. I just read a post on another forum from a guy complaining the fence on his jointer was out .00015". Yes, we're talking ONE AND A HALF ten-thousandths of an INCH! We're working wood, folks; it moves, swells, and shrinks more than that in a day! If we have to set up WW machines to sub-.0001 levels, then how did the masters ever manage to make anything, much less the exquisite masterpieces they did, without the benefit of a dial indicator??? Or maybe we've lost sight of the craftsmanship aspect and we're expecting our machines to do it for us?
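For a quick sense of scale, compare that fence complaint to the paper-thickness figure quoted above (both numbers come straight from the post):

```python
# Scale check: how does the complained-about fence error compare
# to ordinary printer paper?
fence_error = 0.00015  # the jointer-fence error from the other forum, in inches
paper = 0.003          # low end of printer-paper thickness, in inches

print(f"paper is {paper / fence_error:.0f}x the fence error")  # prints: paper is 20x the fence error
```

The paper you'd use to shim that fence is twenty times thicker than the error itself.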
Sorry this turned into a rant, but it just blows my little bitty mind.