# Eyeo '15 Gallery
- Visualization, by @Klase
- Dancing Characters, by @robstenzinger
# MozFest '14
A few of the ideas that came out of this session:
- frequency analysis --> generative ASCII art (see the sketch after this list)
- pre-analyze music, save performance for rendering
- trigger events with JSON
- visualize music then turn the visualization back into music
- waypoints in song change visualization style
- map particle system to text / lyrics
- new mobile interfaces
- adaptive phone background / lighting changes based on music around you
- translate closed captions to time data
- interactive web-wide jam session
- scrape lyrics --> text renderer
- visual ways to display sound, beyond just peaks
- map visuals to stems
- synesthesia
- new visualization styles
- live generative visuals
- find natural "break points" in spoken audio
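As a rough illustration of the first idea (frequency analysis --> generative ASCII art), here is a minimal sketch using the Web Audio API's `AnalyserNode`. It is only one possible approach, not something built in the session; the `#player` element id and the character ramp are illustrative choices.

```typescript
// Minimal sketch: map Web Audio frequency bins to ASCII characters.
// Assumes a browser page with an <audio id="player"> element.
const audio = document.querySelector<HTMLAudioElement>("#player")!;
const ctx = new AudioContext();
const source = ctx.createMediaElementSource(audio);
const analyser = ctx.createAnalyser();
analyser.fftSize = 64;                 // 32 frequency bins
source.connect(analyser);
analyser.connect(ctx.destination);

const bins = new Uint8Array(analyser.frequencyBinCount);
const ramp = " .:-=+*#%@";             // quiet -> loud characters (arbitrary)

function frame(): void {
  analyser.getByteFrequencyData(bins); // 0-255 magnitude per bin
  const line = Array.from(bins, (v) =>
    ramp[Math.floor((v / 256) * ramp.length)]
  ).join("");
  console.log(line);                   // one ASCII "row" per animation frame
  requestAnimationFrame(frame);
}
frame();
```

Each animation frame becomes one row of characters, so logging (or appending to a `<pre>` element) over time produces a scrolling ASCII spectrogram.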
A few participants shared projects that came out of this session:
- lyrics visualizer by @indefinit
- recorder by @markitics