annotatAR: lo-fi augmented reality for the mobile web

Capstone Project Milestones

As part of this week’s assignments, I have revisited the project milestones for annotatAR this fall.


Beta-test at the NYC Internet Yami-Ichi

12 Sept 2015

  • Mobile site deployed on a live server
  • Display the tweet stream overlaid on a getUserMedia video element (see the sketch below)
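
To make that first milestone concrete, here is a rough sketch of the camera-plus-overlay layout in TypeScript. The element ids, the use of navigator.mediaDevices.getUserMedia, and the rear-facing camera constraint are placeholders for illustration, not the final implementation.

    // Put the camera feed behind an absolutely positioned overlay where
    // tweets can be rendered. Assumes <video id="camera"> and
    // <div id="tweets"> elements in the page.
    async function startCamera(): Promise<void> {
      const video = document.getElementById('camera') as HTMLVideoElement;
      const stream = await navigator.mediaDevices.getUserMedia({
        video: { facingMode: 'environment' },  // rear-facing camera on phones
        audio: false,
      });
      video.srcObject = stream;
      await video.play();
    }

    // Append one tweet's text on top of the camera feed.
    function showTweet(text: string): void {
      const overlay = document.getElementById('tweets') as HTMLDivElement;
      const p = document.createElement('p');
      p.textContent = text;
      overlay.appendChild(p);
    }

    startCamera().catch(err => console.error('camera unavailable', err));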

Device sensing

30 Sept 2015

  • Geolocation-aware
  • Access device accelerometer data
  • Map tweet positions into the 3D view (see the sketch after this list)
  • Determine whether to compile to native app(s)
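
A rough sketch of the sensing step, using the deviceorientation event as a stand-in for raw accelerometer data: combine the viewer's geolocation with the device heading to pick a horizontal screen position for each geotagged tweet. The 90-degree field of view and the bearing-to-screen mapping are assumptions for illustration.

    // Combine geolocation with the device heading to place a geotagged
    // tweet on screen.
    let heading = 0;                                  // device heading, degrees
    let here: GeolocationCoordinates | null = null;

    window.addEventListener('deviceorientation', (e) => {
      if (e.alpha !== null) heading = e.alpha;        // rotation about the z-axis
    });
    navigator.geolocation.watchPosition((pos) => { here = pos.coords; });

    // Compass bearing from the viewer to a tweet's coordinates, in degrees.
    function bearingTo(lat: number, lon: number): number {
      const viewer = here;
      if (!viewer) return 0;
      const rad = (d: number) => d * Math.PI / 180;
      const dLon = rad(lon - viewer.longitude);
      const y = Math.sin(dLon) * Math.cos(rad(lat));
      const x = Math.cos(rad(viewer.latitude)) * Math.sin(rad(lat)) -
                Math.sin(rad(viewer.latitude)) * Math.cos(rad(lat)) * Math.cos(dLon);
      return (Math.atan2(y, x) * 180 / Math.PI + 360) % 360;
    }

    // Map the angle between bearing and heading onto a horizontal screen
    // offset, assuming roughly a 90-degree field of view.
    function screenX(lat: number, lon: number): number {
      const delta = ((bearingTo(lat, lon) - heading + 540) % 360) - 180;  // -180..180
      return window.innerWidth / 2 + (delta / 45) * (window.innerWidth / 2);
    }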

Aesthetic refinements on mobile

16 Oct 2015

  • Style tweets based on timestamp (sketched below)
  • Screenshot functionality
  • User testing of the mobile site
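
For the timestamp styling and the screenshot feature, something like the sketch below could work. The 30-minute fade window and the idea of redrawing the overlay text onto a canvas next to the current video frame are assumptions.

    // Fade tweets as they age and capture a "screenshot" of the current
    // camera frame with the overlay text drawn on top.
    function styleByAge(el: HTMLElement, createdAt: Date): void {
      const ageMinutes = (Date.now() - createdAt.getTime()) / 60000;
      el.style.opacity = String(Math.max(0.2, 1 - ageMinutes / 30)); // fade over ~30 min
    }

    function screenshot(video: HTMLVideoElement, overlay: HTMLElement): string {
      const canvas = document.createElement('canvas');
      canvas.width = video.videoWidth;
      canvas.height = video.videoHeight;
      const ctx = canvas.getContext('2d')!;
      ctx.drawImage(video, 0, 0);                    // current camera frame
      ctx.font = '16px sans-serif';
      ctx.fillStyle = 'white';
      let y = 24;
      overlay.querySelectorAll('p').forEach((p) => { // redraw overlay text
        ctx.fillText(p.textContent || '', 12, y);
        y += 24;
      });
      return canvas.toDataURL('image/png');          // data URL to save or post
    }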

Desktop site minimum viable product

30 Oct 2015

  • Display video from the event with tweets overlaid (sketched below)
  • User testing of the desktop site
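
A minimal sketch of the desktop playback idea, assuming the archived tweets are stored with an offset in seconds from the start of the event recording: each tweet is appended to the overlay as playback reaches its offset.

    // Replay archived tweets over the event recording.
    interface ArchivedTweet { offset: number; text: string; }

    function playback(video: HTMLVideoElement, overlay: HTMLElement,
                      tweets: ArchivedTweet[]): void {
      let next = 0;
      video.addEventListener('timeupdate', () => {
        while (next < tweets.length && tweets[next].offset <= video.currentTime) {
          const p = document.createElement('p');
          p.textContent = tweets[next].text;
          overlay.appendChild(p);
          next++;
        }
      });
    }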

Desktop site refinements

20 Nov 2015

  • Text analysis (some or all of the following; sentiment scoring sketched after this list):
    • Sentiment analysis
    • Keyword parsing
    • Spatial and/or color encoding
  • Scrub/navigate along the time dimension
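
The sentiment piece could start as simply as the sketch below: score each tweet against a word list (the list here is only a placeholder; a real lexicon such as AFINN would be far larger) and encode the score as a color for the spatial/color encoding bullet.

    // Score a tweet against a tiny sentiment word list and turn the
    // score into a color.
    const SENTIMENT: Record<string, number> = {
      love: 3, great: 3, good: 2, fun: 2,
      bad: -2, boring: -2, awful: -3, hate: -3,
    };

    function score(text: string): number {
      return text.toLowerCase().split(/\W+/)
        .reduce((sum, word) => sum + (SENTIMENT[word] || 0), 0);
    }

    // Red for negative, green for positive, yellow in the middle.
    function sentimentColor(text: string): string {
      const s = Math.max(-5, Math.min(5, score(text)));
      return `hsl(${60 + s * 12}, 80%, 60%)`;
    }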

Platform scalability

04 Dec 2015

  • GitHub repo with a description and deployment instructions
  • Integrate with existing social media (share buttons; sketched below)
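
Share buttons can lean on Twitter's web intent URL rather than a full API integration; a minimal sketch:

    // Build a share link with Twitter's web intent URL; no API keys needed.
    function shareLink(text: string, url: string): string {
      return 'https://twitter.com/intent/tweet' +
             '?text=' + encodeURIComponent(text) +
             '&url=' + encodeURIComponent(url);
    }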

Bonus features

11 Dec 2015

  • User interface for deploying new app for an event
  • Compile to native Android app (if necessary)
  • Multiple views of augmented reality:
    • Tweet text becomes a lexograph of the video stream
    • Select from a variety of encodings for sentiments or keywords