DiegoBeghin, 23 days ago @Alon @soycamo Right, but you can imagine using an LLM to interpret the query while still outputting sources as the result. I'm pretty sure this is how search works nowadays.