Quote:
Originally Posted by TrevorW
A further example of this: one of the first colour pictures shown was fuzzy; the excuse given was that the lens was covered with dust. Wouldn't you think to make the lens self-cleaning, especially in this type of environment?
From an engineering perspective, they did precisely the right thing.
The camera used was one of the wheel clearance cameras and it is fitted with
a removable transparent dust cover.
Consider the scenarios that planners faced -
The vehicle has just descended; one doesn't know whether it will be sitting on its side
or whether it will have been damaged. Meanwhile, dust is still blowing everywhere from
the lander and will take time to settle. One could automate removal of the dust cover
on landing, but the lens would then still get dusty from the dust cloud.
The rover has no direct line-of-sight path to Earth, and the orbiters relaying data
will be going below the Martian horizon in a few seconds' time. Furthermore, if the
vehicle is damaged or sitting on its side, it may be a major, unrecoverable
mistake to deploy anything mechanical, such as removing the lens cover, until
controllers on Earth can make a full assessment of the consequences of that
action.
In any case, it takes 14 minutes to transmit to Earth, say a minute for the controllers
to make a decision and send a command back, which then takes 14 minutes to
get back to Mars. But to no avail, because by then the orbiters have gone beneath
the horizon and can't receive the transmission to relay on anyway.
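The timing argument above can be sketched as simple arithmetic. The 14-minute one-way light time and the one-minute decision time are the post's round figures; the relay-window length is a made-up placeholder just to show the comparison:

```python
# Rough sketch of the command-latency argument (figures from the post,
# not official mission values).
ONE_WAY_LIGHT_TIME_MIN = 14   # Mars -> Earth transmission time
DECISION_TIME_MIN = 1         # controllers assess and send a command back
RELAY_WINDOW_MIN = 10         # hypothetical time left before the orbiter sets

# Event on Mars -> data reaches Earth -> decision -> command reaches Mars.
round_trip_min = ONE_WAY_LIGHT_TIME_MIN + DECISION_TIME_MIN + ONE_WAY_LIGHT_TIME_MIN
print(round_trip_min)  # 29 minutes from event to command arrival

# The command arrives long after the relaying orbiter has gone below
# the horizon, so there is nothing left to receive and forward it.
print(round_trip_min <= RELAY_WINDOW_MIN)  # False
```

However generous one is with the relay window, a 29-minute round trip can never fit inside a pass that ends within seconds, which is why the sequence had to be pre-planned rather than commanded live.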
So what does one do? All one can do is get off a low-res thumbnail and
some telemetry on the rover's physical orientation, so that cool heads can
take the next step on the following orbiter pass or the next time the rover is
within line of sight of Earth antennas.
Each move has to be made like a slow chess game.