Yanni Loukissas of Georgia Tech and I are pleased to publish the executable version of our Apollo Lunar Landing Visualization.
The Apollo 11 visualization draws together social and technical data from the 1969 Moon landing in a dynamic 2D graphic. The horizontal axis is an interactive timeline. The vertical axis is divided into several sections, each corresponding to a data source. At the top, commentators appear in narratives from Digital Apollo and NASA technical debriefings. Just below are the members of ground control. The middle section is a log-scale altitude graph stretching from Earth (~10^9 ft away) to the Moon. Utterances from the landing CAPCOM (Duke), the command module pilot (Collins), the mission commander (Armstrong), and the lunar module pilot (Aldrin) are plotted on this graph, which is partially overlaid on a composite image of the lunar surface. Data from the Apollo computer systems, the DSKY (the display/keyboard interface to the onboard computer) and the AGC (the Apollo Guidance Computer), occupies the bottom of the visualization.

Each circle on the graph represents an utterance by a crew member or ground control, with the size of the circle proportional to the length of the utterance. Lines connecting successive utterances represent inquiries and responses between team members. Specific events, such as computer program changes and program alarms, are labeled.

During a real-time playback, a white line moves across the horizontal axis as the audio plays, and the crew's utterances are spelled out to its right. In sync with the human dialog, the AGC and DSKY display their values and modes. In these dynamics, one can trace the trading of workload and authority during the critical final phases of the landing, and how that workload was offloaded from the lunar module to Houston in response to the program alarms.
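The description above maps naturally onto a simple data model: timestamped utterances, circles sized in proportion to utterance length, a log-scale vertical placement, and links between successive utterances by different speakers. Here is a minimal, hypothetical sketch of that model; the class and function names (and the sample dialog values) are ours for illustration, not part of the published visualization's code:

```python
import math
from dataclasses import dataclass

@dataclass
class Utterance:
    t: float       # mission elapsed time, in seconds
    speaker: str   # e.g. "CAPCOM", "CDR", "CMP", "LMP"
    text: str

def circle_radius(u: Utterance, scale: float = 0.5) -> float:
    """Size each circle in proportion to the utterance's length."""
    return scale * len(u.text)

def log_altitude_y(altitude_ft: float) -> float:
    """Place a point on the log-scale vertical axis (lunar surface to Earth)."""
    return math.log10(max(altitude_ft, 1.0))

def link_exchanges(utterances: list[Utterance]) -> list[tuple[Utterance, Utterance]]:
    """Connect successive utterances by different speakers,
    approximating the inquiry/response lines in the graphic."""
    return [(a, b) for a, b in zip(utterances, utterances[1:])
            if a.speaker != b.speaker]

# Illustrative timeline fragment (times are invented, not transcript-accurate).
timeline = [
    Utterance(0.0, "CDR", "Program alarm."),
    Utterance(4.0, "CDR", "It's a 1202."),
    Utterance(9.0, "CAPCOM", "We're GO on that alarm."),
]
```

A renderer would then draw one circle per `Utterance` at `(u.t, log_altitude_y(...))` and one line per pair returned by `link_exchanges`.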
Make a robot friend with Adafruit’s CRICKIT – A Creative Robotics & Interactive Construction Kit. It’s an add-on to our popular Circuit Playground Express, FEATHER, and other platforms for making and programming robots with CircuitPython, MakeCode, and Arduino. Start controlling motors, servos, and solenoids. You also get signal pins, capacitive touch sensors, a NeoPixel driver, and amplified speaker output. It complements & extends your boards, so you can still use all the goodies on the microcontroller while gaining a robotics playground as well.
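For a taste of the CircuitPython side, a servo and a DC motor can be driven through the `adafruit_crickit` library. This sketch runs only on-device (e.g. a Circuit Playground Express seated on a CRICKIT with the library bundle installed), so it is a hardware example rather than something you can run on a desktop:

```python
# Runs on-device with the adafruit_crickit library installed.
import time
from adafruit_crickit import crickit

# Sweep a hobby servo plugged into the Servo 1 header.
crickit.servo_1.angle = 0
time.sleep(1)
crickit.servo_1.angle = 180
time.sleep(1)

# Run a DC motor on the Motor 1 terminals at half speed, then stop.
crickit.dc_motor_1.throttle = 0.5
time.sleep(2)
crickit.dc_motor_1.throttle = 0
```

The same `crickit` object also exposes the capacitive touch pads (e.g. `crickit.touch_1.value`) and the other peripherals mentioned above.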