I tried reading through the Google newsgroup archives to see if maybe this question had been asked and answered already, but as soon as I got to an old post by Altavoz it became too painful to continue. So, here goes.
If you're running a motor load at the end of a length of wire, it seems easy enough to use a table or a program to determine the appropriate gauge of wire to keep the voltage drop within a desired range. But how do you calculate what to use if the wire gauge changes midway through the run? Here's an example. Let's say you have a three horsepower, 240 volt, single-phase motor at the end of 300 feet of 10 gauge wire. From what I have come up with when plugging those values into some of the online voltage drop calculators, that's pretty much the maximum you'd want to do in order to stay below a 5% voltage drop. But what if you have 200 feet of wire leading up to the point where that 300 foot run of 10 gauge begins? It's easy enough if you were using 10 gauge the whole way, but what if you had 200 feet of 2 gauge, then 300 feet of 10 gauge? How does that change things?
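For what it's worth, here's a rough sketch of how I understand the segment math to work: the same current flows through both sections, so each segment's drop is just computed separately and the drops add. The resistance values are approximate ohms per 1000 feet for uncoated copper (in the ballpark of NEC Chapter 9, Table 8), and 17 A is the NEC full-load current typically listed for a 3 HP, 240 V single-phase motor (Table 430.248) — both are assumptions here, not measured values from my installation.

```python
# Approximate DC resistance of uncoated copper, ohms per 1000 ft,
# keyed by AWG gauge (values roughly per NEC Chapter 9, Table 8).
OHMS_PER_1000FT = {2: 0.1563, 10: 0.9989}

def voltage_drop(segments, amps):
    """segments: list of (gauge_awg, one_way_feet) tuples.
    Returns total volts dropped; the factor of 2 accounts for the
    round trip (current flows out on one conductor and back on the other)."""
    return sum(2 * amps * OHMS_PER_1000FT[gauge] * feet / 1000.0
               for gauge, feet in segments)

amps = 17.0    # assumed full-load current, 3 HP / 240 V single-phase
supply = 240.0

# 200 ft of 2 gauge from the meter, then 300 ft of 10 gauge down the well:
vd = voltage_drop([(2, 200), (10, 300)], amps)
print(f"drop: {vd:.2f} V ({100 * vd / supply:.2f}%)")
```

With those assumed numbers, the 300 feet of 10 gauge alone drops about 10.2 V (roughly 4.2%), and adding the 200 feet of 2 gauge only bumps the total to about 11.3 V (roughly 4.7%), since 2 gauge contributes so little resistance per foot.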
The above example is taken from a real-world installation of a submersible well pump, where the pump is 300 feet down and there's a 200 foot run from the meter to the junction box for the well pump. I'm wondering not only because the answer would settle some questions about the efficiency of the well pump, but also because in the future there may be a similar 200 foot run from the meter to a subpanel for a shop that would be running motors on lathes, mills, etc. Granted, there wouldn't be a 300 foot run of cable from a lathe/mill motor back to the subpanel, but I'm assuming the calculations would be done the same way, just with different values for the distances. Any bones that could be thrown to a guy who was never very good at math would be much appreciated.