i think it's time we start switching to local or self-hosted llm systems where possible. take granola, for example: i could probably run transcription with a local model and keep all my data on my device. and this goes beyond ai tools. why am i still using google docs?
we are on the verge of a new internet.