A writing assistant was one of the most requested features in our recent survey.
For Proton Mail, 59% of respondents want an easier way to send end-to-end encrypted emails to non-Proton users, while 29% want a writing assistant for proofreading, grammar, and composing emails.
Scribe relies on open-source code and models, and is itself open source and therefore available for independent security and privacy audits.
We built Scribe in r/ProtonMail using an open-source model from Mistral AI, to give anyone who needs help with email productivity a privacy-respecting alternative to r/ChatGPT or r/GeminiAI that:
❌ doesn’t log or save prompts
⛔️ doesn’t use your data for training
🔎 runs on open-source code that anyone can inspect
🖥️ can be run locally, so your data never leaves your device
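The thread never shows what "can be run locally" looks like in practice. As a minimal sketch (not Proton's actual implementation): the only model-specific piece of talking to a Mistral-style instruct model is its `[INST]` prompt template, which is plain string handling. The template below follows Mistral's published instruct format; the runner call in the final comment is hypothetical, any local runner that accepts a raw prompt would do.

```python
# Build a prompt in the Mistral-instruct format for a locally run model.
# Multi-turn chats interleave [INST] ... [/INST] blocks with assistant
# replies terminated by </s>, all after a single leading <s> token.

def build_mistral_prompt(turns):
    """turns: list of (user, assistant) pairs; the last assistant may be None."""
    parts = ["<s>"]
    for user, assistant in turns:
        parts.append(f"[INST] {user} [/INST]")
        if assistant is not None:
            parts.append(f"{assistant}</s>")
    return "".join(parts)

prompt = build_mistral_prompt(
    [("Proofread this email: Hi, I hope your well.", None)]
)
print(prompt)

# The resulting string is what you would feed to a local runner, e.g.
# (hypothetical call, model file is an example):
# llama_cpp.Llama(model_path="mistral-7b-instruct.Q4_K_M.gguf")(prompt)
```

Everything here runs on-device; no prompt or email text leaves the machine, which is the property Proton is advertising.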
Hello, thanks for your interest and kind words! Unfortunately we’re unable to share details about the training and the datasets (extracted from the open Web) due to the highly competitive nature of the field. We appreciate your understanding!
Most people don’t have the tools or desire to figure out how to run an LLM locally.
What if I run a local LLM on my PC and I leave my home? Do I now need to learn how to deploy a VPN at home so I always have access? I could do this, but I don’t want to. Oh, you know a model that runs on Android? What if I have an iPhone?
Proton is a for-profit business that surveyed their customers and got feedback that customers wanted a writing assistant. This one seems the most important.
Am I out of touch?
Apparently, I am. People actually want this
Nothing I hate more than not giving a link to the repo
Not on their support page specifically for it either
Had to go to Reddit and look at their comments to find out they’re using Mistral
https://reddit.com/comments/1e68sof/comment/ldsbs24
https://huggingface.co/mistralai/Mistral-7B-v0.1/discussions/8
Thank you for recognizing this. It gets quite frustrating in threads like these about new AI tools being deployed when people declare “nobody wants this!” And I try to explain that there are actually people that do want it. I find many AI tools to be quite handy.
I tend to get vigorously downvoted at that point, as if that would make the demand “go away” somehow. But sticking our heads in the sand doesn’t accomplish anything except make people increasingly out of touch.
I think that the point is it’s entirely pointless building something like this into the email system. It should be a separate system that you can choose to use if you want it. Building it in just opens questions about exactly what they’re doing with your data, despite their assurances.
For me the issue here is: why put so much time and energy into basically rebranding an LLM? I’ve seen LLMs running on a Raspberry Pi and on Android phones. Why not write a blog post showing how to run LLMs locally with existing tools for the best privacy instead, and put more focus on their existing services? It just seems like they’re jumping on the AI bandwagon and charging a premium for an already freely available LLM.
I see some benefits of AI, like quality text-to-speech when using OSM and speech-to-text when transcribing/translating audio, but other things like Google’s AI answers and Microsoft’s Copilot leave me scratching my head, wondering why consumers would want this.
Probably because at the end of the day:
This is one of those situations. People would eat sweet food if they didn’t know it’s bad for them. In fact they still eat shitloads while they do know it’s bad. Same goes for opiates, and there’s a reason why they are regulated in most of the world (the USA of course is fucked up with this). So yes, let’s boil the fucking oceans so dumb fucks don’t have to learn how to write a fucking email. Let’s say people have to learn what kinds of shit are included with “AI”, they have to learn how much it actually costs, what all of the NFT cunts are now doing, and then say if they want something or not. I mean, this is all just fucking gaslighting by now, right? I don’t care what a fucking ignorant shit thinks about mostly anything - but this is a shitty company telling us ‘you want it so hard baby’ so it can pump that AI investor hose SO HARD
Seconded
The thing that pisses me off the most is that they are disingenuous almost to the point of lying in interpreting that survey’s results. They say that 75% of users are interested in GenAI, when actually what they asked is whether people have used any GenAI at all in the recent past. And that still doesn’t mean they want GenAI in Proton. That’s a pretty significant sleight of hand. The more relevant question would have been the first one on what service people want the most. In that case only 29% asked for a writing assistant, which is still not the same thing as a full LLM. The most likely answer to “how many Proton customers want an LLM in Proton Mail” seems to be “few”.
I think the philosophical concept of open source can’t really work for ML models unless the training data is open as well. As it stands, these “open source” models are still very much a black box. Nobody was really questioning the implementation of GPT.
Yeah this would be like Google saying Google Search was “open source” because map-reduce was open, or something.
You can run it locally with the GPT4All toolkit, along with any other LLM you want.
Yeah, there are loads of ways to run LLMs locally. My issue is: why not just link to the source code if you’re going to say it’s open source, and mention the model used? The open-source AI scene also seems to never include the training data in their source code, which is why I wanted to see if Proton would.
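Since the thread points at GPT4All: its Python bindings wrap a llama.cpp backend and run entirely offline. A minimal sketch, assuming the `gpt4all` package is installed (`pip install gpt4all`); the function name is mine and the model filename is just an example, which the library downloads on first use.

```python
# Sketch of the suggestion above: drive a local model through GPT4All's
# Python bindings. Nothing here calls out to a remote inference API.

def draft_reply(prompt: str,
                model_file: str = "mistral-7b-instruct-v0.1.Q4_0.gguf") -> str:
    """Generate text with a locally stored GGUF model (example filename)."""
    from gpt4all import GPT4All  # imported lazily; pip install gpt4all
    model = GPT4All(model_file)  # loads (or first downloads) local weights
    with model.chat_session():   # keeps multi-turn context on-device
        return model.generate(prompt, max_tokens=120)

# Usage (downloads a few GB of weights on first run):
# print(draft_reply("Draft a two-sentence thank-you email."))
```

This is roughly the "freely available LLM" baseline the commenters are comparing Scribe against: the same class of model, runnable without any Proton subscription.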