I released a new version of my #GoLang module for persistent message queues based on #SQLite today. I added a basic job runner, since that’s one of the things I most use a queue for. Check it out at https://www.goqite.com
Richard Hipp, developer of #SQLite, on a podcast, talking about open contributions in FOSS: "They come to you and say, 'Well, I've got this cool feature. Why don't you just take it? I don't understand.' What they're really saying to me is, 'Hey, I made this hack that's useful to me. Why don't you take it, test it, and maintain it for me for 25 years for free?'"
@vwbusguy
Suggestion:
import sqlite3 and use the SQLite date/time functions. No database file required. It will be faster than shelling out to bash all the time.
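To spell that suggestion out, a minimal sketch in Python: an in-memory connection is all you need to reach SQLite's built-in date/time functions, no file and no tables.

```python
import sqlite3

# An in-memory connection is enough; SQLite's date/time functions
# don't need any tables or files.
con = sqlite3.connect(":memory:")

# datetime('now') instead of shelling out to `date`:
(now,) = con.execute("SELECT datetime('now')").fetchone()

# Date arithmetic via modifiers:
(next_week,) = con.execute("SELECT date('now', '+7 days')").fetchone()

# Custom formatting with strftime:
(year,) = con.execute("SELECT strftime('%Y', 'now')").fetchone()

print(now, next_week, year)
```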
Very nice tool for converting an #SQLite database into #MySQL: https://github.com/techouse/sqlite3-to-mysql. I'm planning to experiment with switching from SQLite to MySQL or Postgres (and maybe DuckDB?) for the feeds database and see how it behaves and how the performance differs… 🤔
It is a bit unfortunate that one can't back up a #SQLite database to, and restore it from, a byte buffer. Otherwise one could also use it to send serialized data over the network 🙈
Okay, if you have 250 GB of disk space available, and a few gigs of RAM, #KomootPhoton is impressive. A query like localhost:2322/api?q=rathausstr+17, with no further qualifiers, bounding box, or whatever, returned after 1048 ms.
That's on my laptop. Searching over all of the addresses on the globe.
So, uhm, not to dunk on Photon, but I've just built a proof of concept #SQLite database containing all of the addresses in OpenStreetMap and created a full-text search index over them.
A query like "give me all OSM objects where any of the address fields starts with 'Frankf'" takes 36 ms and uses 11 MB of RAM.
That's an FTS5 trigram search over 13.5 million unique strings, then joining them with nodes & ways.
The database takes 25 GB, compared to Photon's 230 GB.
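For anyone curious what that looks like: a toy sketch in Python, not the actual OSM database. The schema and names here are made up, and for portability this uses a plain FTS5 prefix query with the default tokenizer rather than a trigram index; the shape of the "starts with 'Frankf'" query is the same.

```python
import sqlite3

con = sqlite3.connect(":memory:")

# Made-up schema: one row per unique address string, plus an
# external-content FTS5 index over it (the real database then joins
# matches back against OSM nodes & ways).
con.executescript("""
    CREATE TABLE addresses(id INTEGER PRIMARY KEY, value TEXT);
    CREATE VIRTUAL TABLE addresses_fts USING fts5(
        value, content='addresses', content_rowid='id'
    );
""")
rows = [
    (1, "Frankfurter Allee"),
    (2, "Frankfurt am Main"),
    (3, "Rathausstr. 17"),
]
con.executemany("INSERT INTO addresses VALUES (?, ?)", rows)
con.executemany("INSERT INTO addresses_fts(rowid, value) VALUES (?, ?)", rows)

# "Any address field starts with 'Frankf'" as an FTS5 prefix query:
hits = con.execute(
    "SELECT value FROM addresses_fts WHERE addresses_fts MATCH 'Frankf*'"
).fetchall()
print(hits)  # the two Frankfurt rows, not Rathausstr. 17
```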
I have a SQLite database with 1 table. The table has 30 entries.
I feel like figuring out how much faster the SQLite version is, with reads, updates, and inserts, vs a text file.
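If anyone actually wants to run that 30-row shootout, here's a throwaway benchmark sketch in Python. All the names and the loop count are arbitrary; the text-file side rewrites the whole file per update, which is the honest comparison.

```python
import os
import sqlite3
import tempfile
import time

def bench(fn, n=200):
    """Run fn n times and return total seconds."""
    t0 = time.perf_counter()
    for _ in range(n):
        fn()
    return time.perf_counter() - t0

tmp = tempfile.mkdtemp()
txt_path = os.path.join(tmp, "data.txt")
entries = [f"entry-{i}" for i in range(30)]

# Text-file "table": every update rewrites the whole file.
with open(txt_path, "w") as f:
    f.write("\n".join(entries))

def text_read():
    with open(txt_path) as f:
        return f.read().splitlines()

def text_update():
    lines = text_read()
    lines[7] = "updated"
    with open(txt_path, "w") as f:
        f.write("\n".join(lines))

con = sqlite3.connect(os.path.join(tmp, "data.db"))
con.execute("CREATE TABLE t(id INTEGER PRIMARY KEY, v TEXT)")
con.executemany("INSERT INTO t(v) VALUES (?)", [(e,) for e in entries])
con.commit()

def db_read():
    return con.execute("SELECT v FROM t").fetchall()

def db_update():
    con.execute("UPDATE t SET v = 'updated' WHERE id = 8")
    con.commit()

print(f"reads:   text {bench(text_read):.4f}s  sqlite {bench(db_read):.4f}s")
print(f"updates: text {bench(text_update):.4f}s  sqlite {bench(db_update):.4f}s")
```

At 30 rows the absolute numbers are tiny either way; the interesting part is how the gap moves as the commit-per-update cost meets the rewrite-the-whole-file cost.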
@jamesog thanks for sharing this! I’m going to have to play with it. I have some complex #JSON cases that might really benefit. I also see promise in cases where today I might translate #XLSX to #CSV and then import into #SQLite. Why do all that if I can query the original directly? Awesome!
And to start off Monday, we have "This database engine does not support fulltext search operations." when running local tests in SQLite, even though we have verified that FTS5 is enabled.
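Whatever layer is raising that error, it's worth probing the SQLite library the tests actually link against. A quick Python sketch (the probe-table name is arbitrary) that checks both ways:

```python
import sqlite3

def fts5_available(con: sqlite3.Connection) -> bool:
    """Probe by actually creating a throwaway FTS5 table, which is
    more reliable than trusting compile-time flags alone."""
    try:
        con.execute("CREATE VIRTUAL TABLE temp.fts5_probe USING fts5(x)")
        con.execute("DROP TABLE temp.fts5_probe")
        return True
    except sqlite3.OperationalError:
        return False

con = sqlite3.connect(":memory:")
print("fts5 probe:", fts5_available(con))

# Cross-check against the compile options of the linked library:
opts = [row[0] for row in con.execute("PRAGMA compile_options")]
print("ENABLE_FTS5 compiled in:", any("FTS5" in o for o in opts))
```

If the probe passes but the framework still refuses, the "unsupported" message is coming from the framework's grammar for the SQLite driver, not from SQLite itself.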
@mjp @nicebyte @solidangle Decided to have some Sunday programming fun with #Vulkan and #SQLite. Took me less than 30 minutes to build a (very simplistic) SQLite-based recording and replay system for my samples. I've been working with real-time graphics and databases for decades, but never really combined them in such a way. Combining those two very different things was a lot of fun :)
I had to implement push notifications on a project that had a particular constraint. Long story short, I can't (can but can't; don't ask) use my backend to store anything. But for this to work, I had to store the subscriptions to the notifications (again, don't ask). And... enter #SQLite. There's an npm package ready to use, and it works flawlessly, so I can have a full-fledged, simple, and lightweight SQL database right in my frontend. This solved my problem and I'm grateful for this to exist. :)