# 🧠 Remember everything. (very alpha - download anyway)
I would love to keep this project alive and growing, but I can't do it alone.
If you're at all interested in contributing, please reach out, start a discussion, open a PR, check out the issues, look at the roadmap below, etc.
Something not working properly? There's no telemetry or tracking, so I won't know! Please log an issue, or take a crack at fixing it yourself and submit a PR! Have feature ideas? Log an issue!
Want to learn more about the code? Here's the Generated Wiki.
An open source approach to locally record everything you view on your Mac (prefer other platforms? Come help build xrem, the cross-platform version of this project).
_Note: Only tested on Apple Silicon, but there is now an Intel build._
Please log any bugs / issues you find!
Looking at this code and grimacing? Want to help turn this project into something awesome? Please contribute. I haven't written Swift since 2017. I'm sure you'll write better code than me.
I think the idea of recording everything you see has the potential to change how we interact with our computers, and believe it should be open source.
Also, from a privacy / security perspective, this is like... pretty scary stuff, and I want the code open so we know for certain that nothing is leaving your laptop. Even telemetry has the potential to leak private info.
This is 100% local. Please, read the code yourself.
Also, that means there is no tracking / analytics of any kind, so I won't know you're running into bugs unless you tell me. Please report any / all you find!
## Features

- Automatically take a screenshot every 2 seconds, recognizing all on-screen text, using an approach that is efficient in terms of disk space and energy (see the sketch after this list)
- Go back in time (full-screen scrubber of everything you've viewed)
- Copy text from back in time
- Search everything you've viewed with keyword search (and filter by application)
- Easily grab recent context for use with LLMs
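
To make the capture bullet concrete, here is a minimal sketch of a capture-and-recognize loop using CoreGraphics and Apple's Vision framework. This is an illustration under assumptions, not rem's actual implementation: the `.fast` recognition level is an assumed nod to the energy-efficiency goal, and newer code on recent macOS would use ScreenCaptureKit instead of `CGDisplayCreateImage`.

```swift
import Foundation
import CoreGraphics
import Vision

// Sketch only: capture the main display and run Vision text recognition.
// A real pipeline would also encode frames to video and persist the text.
func captureAndRecognize() {
    // Simplest capture call; newer code would use ScreenCaptureKit.
    guard let frame = CGDisplayCreateImage(CGMainDisplayID()) else { return }

    let request = VNRecognizeTextRequest { request, _ in
        let observations = request.results as? [VNRecognizedTextObservation] ?? []
        let text = observations
            .compactMap { $0.topCandidates(1).first?.string }
            .joined(separator: "\n")
        print(text) // a real app would index this for search
    }
    // Assumed: .fast trades accuracy for the efficiency noted above.
    request.recognitionLevel = .fast

    try? VNImageRequestHandler(cgImage: frame, options: [:]).perform([request])
}

// Match the 2-second cadence described in the feature list.
Timer.scheduledTimer(withTimeInterval: 2, repeats: true) { _ in
    captureAndRecognize()
}
RunLoop.main.run()
```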
## Roadmap

- Intel build (please help test!)
- It "works" with external / multiple monitors connected
- Natural language search / agent interaction via updating a local vector embedding (see the embedding sketch after this list)
- Novel search experiences like spatial / similar images
- More search filters (by time, etc.)
- Fine-grained purging / trimming / selecting recording
- Better / First-class multi-monitor support
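
For the natural-language-search roadmap item, one fully local option is Apple's NaturalLanguage framework. A speculative sketch of the idea (the snippets are invented, and this is not code rem ships):

```swift
import NaturalLanguage

// Speculative sketch: rank previously recognized screen text against a
// free-form query using on-device sentence embeddings.
guard let embedding = NLEmbedding.sentenceEmbedding(for: .english) else {
    fatalError("sentence embeddings unavailable on this system")
}

// Hypothetical snippets of text recognized from past screenshots.
let history = [
    "Reviewing the quarterly budget spreadsheet",
    "Reading a blog post about Swift concurrency",
    "Booking flights for the team offsite",
]
let query = "that article about async/await"

// Smaller cosine distance means a closer semantic match.
let ranked = history.sorted {
    embedding.distance(between: query, and: $0, distanceType: .cosine)
        < embedding.distance(between: query, and: $1, distanceType: .cosine)
}
print(ranked.first ?? "no match")
```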
## Usage

- Download the latest release, or build it yourself!
- Launch the app
- Click the brain
- Click "Start Remembering"
- Grant it access to "Screen Recording", i.e. permission to take screenshots every 2 seconds (see the permission sketch after this list)
- Click "Open timeline" or "Cmd + Scroll Up" to open the timeline view
- Scroll left or right to move in time
- Click "Search" to open the search view
- Search your history and click on a thumbnail to go there in the timeline
- In the timeline, give Live Text a second, and then you can select text
- Click "Copy Recent Context" to grab a prompt for interacting with an LLM with what you've seen recently as context
- Click "Show Me My Data" to open a finder window where
rem
stores SQLite db + video recordings - Click "Purge All Data" to delete everything (useful if something breaks)
(that should be all that's needed)
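
On the "Screen Recording" step: macOS gates screenshot APIs behind the screen-capture permission, and the check looks roughly like this (an assumed sketch, not lifted from rem's source):

```swift
import CoreGraphics

// CGPreflightScreenCaptureAccess reports whether Screen Recording access
// has already been granted; CGRequestScreenCaptureAccess triggers the
// system prompt pointing at System Settings > Privacy & Security >
// Screen Recording.
if CGPreflightScreenCaptureAccess() {
    print("Screen Recording permitted; captures will contain real pixels")
} else {
    // Returns immediately; the user grants access in System Settings,
    // and the app generally needs a relaunch before captures succeed.
    CGRequestScreenCaptureAccess()
}
```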
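And for "Copy Recent Context", a hypothetical sketch of the idea: stitch recently recognized text into a prompt and put it on the clipboard. The snippet contents and prompt wording here are invented; rem's real format may differ.

```swift
import AppKit

// Invented example data standing in for recently recognized screen text.
let recentSnippets = [
    "Inbox (3) - Re: Q3 planning",
    "PR #42: fix timeline scrubber jitter",
    "docs/search.md - keyword filters",
]

// Hypothetical prompt format.
let prompt = """
The following is text I have seen on my screen recently, oldest first:

\(recentSnippets.joined(separator: "\n---\n"))

Use it as context when answering my next question.
"""

// Place it on the clipboard, as the menu item does.
let pasteboard = NSPasteboard.general
pasteboard.clearContents()
pasteboard.setString(prompt, forType: .string)
```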
## Build it yourself

- Clone the repo: `git clone --recursive -j8 https://github.com/jasonjmcghee/rem.git`, or run `git submodule update --init --recursive` after cloning
- Open the project in Xcode
- Product > Archive
- Distribute App
- Custom
- Copy App
## FAQ

- Where is my data?
- Click "Show Me My Data" in the tray / status icon menu
- Currently it is stored in:
~/Library/Containers/today.jason.rem/Data/Library/Application Support/today.jason.rem
- It was originally:
~/Library/Application\ Support/today.jason.rem/
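
If you want to inspect the data directly, here's a small sketch using the SQLite3 C API that ships with macOS. The database filename below is a placeholder (use "Show Me My Data" to find the real file), and no table names are assumed: it just lists whatever `sqlite_master` reports.

```swift
import Foundation
import SQLite3

// Placeholder filename: check the folder "Show Me My Data" opens.
let dir = "~/Library/Containers/today.jason.rem/Data/Library/Application Support/today.jason.rem"
let path = (dir as NSString).expandingTildeInPath + "/<your-db-file>.sqlite3"

var db: OpaquePointer?
guard sqlite3_open(path, &db) == SQLITE_OK else {
    fatalError("could not open database at \(path)")
}
defer { sqlite3_close(db) }

// List every table instead of assuming a schema.
var stmt: OpaquePointer?
let sql = "SELECT name FROM sqlite_master WHERE type = 'table';"
if sqlite3_prepare_v2(db, sql, -1, &stmt, nil) == SQLITE_OK {
    while sqlite3_step(stmt) == SQLITE_ROW {
        if let name = sqlite3_column_text(stmt, 0) {
            print(String(cString: name))
        }
    }
}
sqlite3_finalize(stmt)
```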
- Wow that logo is so great, you're an artist. Can I see your Figma?
    - So nice of you to say! Sure, here it is.