A local LLM would be great for processing these summaries; some local models are getting very powerful. Ollama itself adds complexity on the user side, especially installation friction and config files, but it would be well worth it for the "no information leakage" aspect.
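As a rough sketch of what that could look like: Ollama exposes a local HTTP API on port 11434, so summarization never leaves the machine. The model name `llama3` and the prompt wording below are just examples, not requirements.

```python
# Minimal sketch: summarizing text with a locally running Ollama server.
# Assumes Ollama is installed and serving on its default port (11434).
import json
import urllib.request

OLLAMA_URL = "http://localhost:11434/api/generate"

def build_request(text: str, model: str = "llama3") -> dict:
    """Build the JSON payload for Ollama's /api/generate endpoint."""
    return {
        "model": model,
        "prompt": f"Summarize the following in two sentences:\n\n{text}",
        "stream": False,  # return one JSON object instead of a token stream
    }

def summarize(text: str) -> str:
    """Send the prompt to the local server; no data leaves the machine."""
    payload = json.dumps(build_request(text)).encode()
    req = urllib.request.Request(
        OLLAMA_URL, data=payload, headers={"Content-Type": "application/json"}
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]

if __name__ == "__main__":
    print(summarize("Ollama runs models entirely on local hardware."))
```

The upside of this shape is that the only moving part the user has to install is Ollama itself; the client is stdlib-only, which keeps the setup friction on one side.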