Quote:
Originally Posted by RobF
This was a gel battery, about 12V power supply - astrobot suggested that should be given too
Thanks for the additional info. Some people like to run their mounts on ~15V (via an AC adapter) instead of ~12V. If the mount draws roughly constant power, that means around 20% less current at 15V, and hence a ~20% error in battery-life calculations if the wrong voltage was assumed.
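To put numbers on that, here's a quick sketch, assuming the mount draws roughly constant power regardless of supply voltage. The wattage and battery capacity below are purely illustrative, not measured values:

```python
# Current draw at 12V vs 15V, assuming the mount draws constant *power*.
# MOUNT_POWER_W and BATTERY_AH are hypothetical figures for illustration.

MOUNT_POWER_W = 12.0   # assumed constant power draw of the mount
BATTERY_AH = 7.0       # assumed gel battery capacity (amp-hours)

for volts in (12.0, 15.0):
    amps = MOUNT_POWER_W / volts
    print(f"{volts:.0f}V supply: {amps:.2f}A draw, "
          f"~{BATTERY_AH / amps:.1f}h from a {BATTERY_AH:.0f}Ah battery")

i12 = MOUNT_POWER_W / 12.0
i15 = MOUNT_POWER_W / 15.0
print(f"current reduction at 15V: {(1 - i15 / i12) * 100:.0f}%")  # 20%
```

So if you assumed 12V but the mount was actually running on 15V, your predicted runtime would be off by that same ~20%.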
---------
Regarding power vs gearing, I guess following on from the other thread (and interesting topic you raise!) ...
To my way of thinking, an imbalance on either axis (e.g. deliberately "pushing up", as some people like to set up their mounts), stiff grease (from temperature or age), worn bearings, or payload weight will all increase the mechanical load and hence the power drawn*. Any of these is likely to cause more variance in power draw than differences in the mechanical efficiency of the gearing (e.g. gears vs belt).
Although belt drives are theoretically more efficient than transfer (spur) gears (IIRC), the losses aren't dramatically different. Using some remembered automotive numbers as an example, a belt-driven CVT has a mechanical efficiency of around 95%, whereas a manual gearbox is around 90% (albeit with lots of assumptions about the range of input power and torque). I'd be academically interested if there are decent figures for the efficiency of spur gears vs belts as found in a mount. I stumbled across these figures, but don't know how reliable they are:
http://www.eng-tips.com/viewthread.cfm?qid=74637
This is also an interesting article, though it has only qualitative info on drive efficiency:
http://www.dfmengineering.com/news_t...e_gearing.html
In an EQ mount, the final worm drive is likely the lossiest mechanical component either way, since worm drives (IIRC) typically have only ~70% efficiency, though it depends on the ratio.
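That suggests the worm dominates: since the stage efficiencies multiply, swapping spur gears for a belt only shifts the overall figure a little. A quick sketch, using the rough/remembered efficiencies from above (not measurements):

```python
# Overall drivetrain efficiency = product of stage efficiencies.
# All figures are the rough values quoted in the text, for illustration only.

WORM = 0.70   # assumed worm drive efficiency
BELT = 0.95   # assumed belt transfer stage efficiency
SPUR = 0.90   # assumed spur gear transfer stage efficiency

belt_total = BELT * WORM   # belt + worm
spur_total = SPUR * WORM   # spur + worm
print(f"belt + worm: {belt_total:.1%}")                         # 66.5%
print(f"spur + worm: {spur_total:.1%}")                         # 63.0%
print(f"relative improvement: {belt_total / spur_total - 1:.1%}")
```

On these assumed numbers, the belt buys you only a few percent overall, which supports the idea that load and balance matter more than the transfer stage.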
For batteries, there are so many other variables (temperature, initial state of charge, standing time, payload, balance setup, etc.) that it'd be hard to say shortened battery life is caused primarily by gear adjustment (assuming the gears aren't tightened to the point of binding). Nevertheless, gear adjustment might have an effect. I'd be interested in a controlled experiment.
When I measured power draw on my EQ6 Pro, I had it carefully balanced with my typical payload of (at that time) 13kg. I suppose I should actually measure power draw with different payloads and a slight imbalance, to see if there's a noticeable difference. It'd be interesting to see if the practice matches the theory.
* It's been a long time since I did DC motor theory, but IIRC (a big assumption!): increasing the mechanical load slows the motor, which reduces the back EMF, which increases the net voltage across the winding and hence the current drawn, which increases the motor's torque and power output until it balances the load ... or something like that. It's a pretty neat self-correcting system, if operated within its design limits, with the general relationship that increasing load increases power drawn.
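For what it's worth, that feedback loop can be sketched with the textbook steady-state brushed DC motor model, I = (V - Ke*w) / R with torque = Kt*I. All the constants below are made up for illustration (and note a stepper-driven mount behaves differently, so this is only the brushed-DC picture):

```python
# Steady-state brushed DC motor: more load -> lower speed -> less back EMF
# -> more current -> more torque, until motor torque balances load torque.
# V, R, KE, KT are illustrative values, not measured mount figures.

V = 12.0    # supply voltage (V)
R = 2.0     # winding resistance (ohms) - assumed
KE = 0.05   # back-EMF constant (V per rad/s) - assumed
KT = 0.05   # torque constant (N*m per A) - assumed

def steady_state(load_torque_nm):
    # At equilibrium, motor torque equals load torque: Kt * I = T
    current = load_torque_nm / KT
    back_emf = V - current * R
    speed = back_emf / KE          # rad/s
    power_in = V * current         # electrical power drawn
    return current, speed, power_in

for torque in (0.05, 0.10, 0.20):  # increasing mechanical load
    i, w, p = steady_state(torque)
    print(f"load {torque:.2f} N*m -> {i:.2f} A, {w:.0f} rad/s, {p:.1f} W in")
```

Running it shows exactly the relationship above: doubling the load torque doubles the current and input power, and the motor settles at a lower speed.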