While developing Ovvio, we're constantly trying to preserve and build on top of the good parts of a physical notebook. To read more about our general approach to work management, head over to our blog post detailing Ovvio's fundamentals.
One of the true strengths of a physical notebook is that you first write, then easily reflect on what you wrote. The downside is that notes quickly add up and start requiring a lot of organizational effort to remain truly effective for professional work management.
Yet more than 80% of the professionals we interviewed manage their work in notebooks: either physical notebooks (yes, real pen and paper in 2022) or basic notebook apps that conceptually work like pen and paper.
With Ovvio we embrace a "Write First, Organize Later" mentality.
We keep the ease of creation of a notebook, while automating or deferring all organizational chores.
In Ovvio, notes take a single click to create, which lets our users quickly jot down their thoughts without deciding in advance where a note belongs. We then automatically extract all tasks (and sub-tasks!) from all notes, and neatly organize them in a task management interface.
Organizing and finding stuff in Ovvio is therefore based on filtering and sorting rather than more traditional hierarchical folders. In fact, filters in Ovvio act more like navigation than true filters.
They are also everywhere. About half the buttons on screen at any given moment actually trigger filters behind the scenes. While really speeding up our users' work, this presents non-trivial engineering challenges.
Let's go through our stages of building a work management solution with truly unique filtering capabilities:
Our early versions of Ovvio implemented filters using a modern server side approach. When a user interacts with a filter, the app builds the appropriate query and sends it to the server. The server then executes the query on behalf of the app, and returns the results to the app.
While it worked, this approach resulted in an extremely slow user experience. Whenever you clicked a filter you'd have to wait in front of a progress bar of some sort. And if you had a poor network connection, it would simply not work due to timeouts. This is unacceptable for our users, who constantly move around with bad network.
Not only that, but this approach fundamentally conflicts with real time collaboration. Whenever the app downloads query results from the server, it then has to deal with a snapshot of that data from when the server executed the query, and correctly merge it with any local changes the user may have. This is surprisingly hard to get right, and is really time consuming to develop.
Later versions of Ovvio dropped the server side approach in favor of a custom, client side solution. We approached the problem like a distributed database would, and decided to maintain a collection of small indexes on every user's computer rather than one huge index for all users in our cloud. We could then run queries on the user's computer without any network involved. These indexes lived in-memory, and provided instant results to user interactions. No more progress bars, yay!
To pull it off, the app downloaded all records from the server and cached them locally in the browser's storage. Whenever a new tab was opened, it loaded all cached records and maintained a collection of binary trees in memory. These binary trees had carefully selected key orderings so that when it was time to execute a query, we'd simply run a zig-zag merge join and get immediate results.
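To give a feel for the join step, here is a minimal sketch of a zig-zag merge join. It assumes each index can be reduced to a sorted array of record ids for a given filter value; the names and data shapes are illustrative, not Ovvio's actual internals.

```typescript
// Zig-zag merge join: intersect two sorted id lists by repeatedly
// advancing whichever pointer lags behind, emitting ids found in both.
function zigZagJoin(a: number[], b: number[]): number[] {
  const out: number[] = [];
  let i = 0;
  let j = 0;
  while (i < a.length && j < b.length) {
    if (a[i] === b[j]) {
      out.push(a[i]); // id matches both filters
      i++;
      j++;
    } else if (a[i] < b[j]) {
      i++; // "zig": catch a up to b
    } else {
      j++; // "zag": catch b up to a
    }
  }
  return out;
}

// Example: ids matching "assigned to me" AND "status is open".
const assignedToMe = [2, 3, 5, 8, 13, 21];
const openTasks = [1, 3, 4, 8, 9, 21, 34];
console.log(zigZagJoin(assignedToMe, openTasks)); // [3, 8, 21]
```

Because both inputs stay sorted, each query runs in a single linear pass over the shorter lists, which is what made the in-memory indexes feel instant.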
This worked surprisingly well and enabled us to remove all progress bars, skeletons and spinners from most flows. You'd be hard pressed to find a spinner in Ovvio. In fact, if you do find one, rest assured we're constantly thinking how to get rid of it :)
Filtering in Ovvio became instant. But now we had a new problem on our hands: startup time.
Constructing and maintaining all those indexes is no simple task, and Ovvio's startup times were horrible for large accounts. With older computers and large workspaces, Ovvio could take more than 2 minutes to open a new tab. This really shouldn't be surprising since indexing everything like we did is effectively pre-computing most possible filter configurations when the app starts.
Ideally we would just build our indexes once, save our work, and update it only when something changes. This however gets real messy real quick when dealing with multiple open browser tabs. Thus we needed to come up with a better way to do filters in Ovvio.
Slowing Down to Look Faster
Ironically, in order to make Ovvio start faster, we had to make it slower. We could no longer afford to construct our indexes on startup, but luckily our users' data was smaller than anticipated. We thought we'd be dealing with ~50k notes and tasks per user, but the reality is more like 5-10k per user. Thus, we decided to move away from indexes and resort to linear search instead.
But wait, linear search is MUCH slower! How then can we still provide a smooth experience while using the slowest, simplest technique possible? The key here is to design for responsiveness rather than performance.
Using linear search allows us to easily pause a running query and resume work at a later time. So, while executing a query, we monitor how long it's taking, and if it takes more than 5.5ms (1/3 of a frame at 60 FPS) we pause and return control to the browser's event loop. We will then resume work in the next event loop cycle.
Using this technique, we multiplex all concurrent queries, ensuring our UI stays responsive while filtering through thousands of notes and tasks.
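A time-sliced linear scan along these lines can be sketched as follows. This is a simplified illustration, not Ovvio's actual code: the record shape, the `onPartial` callback, and the use of `setTimeout` to yield are all assumptions made for the example.

```typescript
const FRAME_BUDGET_MS = 5.5; // 1/3 of a frame at 60 FPS

// Scan `records` with `predicate`, yielding control back to the event
// loop whenever the time budget is exhausted. Partial results are
// reported via `onPartial` so the UI can render matches as they arrive.
async function query<T>(
  records: T[],
  predicate: (r: T) => boolean,
  onPartial?: (matches: T[]) => void,
): Promise<T[]> {
  const matches: T[] = [];
  let i = 0;
  while (i < records.length) {
    const sliceStart = performance.now();
    // Scan until either the records run out or the budget does.
    while (
      i < records.length &&
      performance.now() - sliceStart < FRAME_BUDGET_MS
    ) {
      if (predicate(records[i])) matches.push(records[i]);
      i++;
    }
    onPartial?.(matches);
    // Return control to the event loop; resume on the next cycle.
    await new Promise((resolve) => setTimeout(resolve, 0));
  }
  return matches;
}

// Example: find all open tasks among a few thousand records.
interface Task { id: number; done: boolean; }
const tasks: Task[] = Array.from({ length: 5000 }, (_, id) => ({
  id,
  done: id % 3 === 0,
}));
query(tasks, (t) => !t.done).then((open) => console.log(open.length));
```

Because the scan is just a loop over an array with a counter, pausing and resuming costs nothing; the counter simply keeps its value across event loop cycles. That simplicity is exactly what an index-based approach gives up.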
The result of this design is that filters actually take longer to run than before. Ovvio spends more time filtering notes and tasks, yet users perceive Ovvio to work faster.
This technique enables us to easily show intermediate results while queries run, greatly increasing the app's responsiveness to user interaction.