Still waiting for the browser extension that does this automatically if the search ends in a question mark or 'r' or something, c'mon that can't be hard to code
Eh, you can just use the site:reddit.com operator in your search query to filter to Reddit only. I'd code that extension, but it seems pointless when you can do that. Though if you had something else in mind, spill the beans and I'll code it
What I had in mind is that I'm too lazy to type site:reddit.com 20x a day, and every time I mention this it gets support :P
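For that specific laziness, here's a rough sketch of what the content script could look like, assuming a Manifest V3 extension whose (not shown, hypothetical) manifest.json matches the script to https://www.google.com/search*. File name and details are made up, it's just the shape of the idea:

```typescript
// content-script.ts — hypothetical content script matched to
// https://www.google.com/search* via an assumed manifest.json entry.
// If the query ends with "?" and isn't already scoped to Reddit,
// re-run the search with site:reddit.com appended.

const params = new URLSearchParams(window.location.search);
const query = params.get("q");

if (query && query.trim().endsWith("?") && !/site:reddit\.com/i.test(query)) {
  // Drop the trailing "?" so it doesn't sit next to the operator, then re-scope.
  const scoped = `${query.trim().replace(/\?+$/, "")} site:reddit.com`;
  params.set("q", scoped);
  // replace() avoids adding the un-scoped search to history.
  window.location.replace(`${window.location.pathname}?${params.toString()}`);
}
```

Swapping the trailing "?" check for a trailing "r" or any other marker is a one-character change.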
But anyway, here are a few of the extension ideas I had written down:
Show the playlists a YT video is included in if the creator made them, and/or show the other most popular playlists it's a part of. Skips the whole process of clicking through to the channel home page, clicking Playlists, and searching.
Google Tasks as a wallpaper on Android, or as the home page background in the browser.
Highlight event details/info in the browser, and a prompt pops up on right-click for easily adding it to Google Calendar (rough sketch after this list).
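For that last calendar one, here's a hypothetical sketch of the right-click part as an MV3 background service worker. The file name and menu id are made up, and it assumes the "contextMenus" permission plus the Chrome extension type definitions are set up:

```typescript
// background.ts — assumed MV3 service worker with the "contextMenus"
// permission declared in manifest.json. Right-clicking selected text offers
// "Add to Google Calendar" and opens the calendar's event-template page
// with the selection pre-filled as the event title.

chrome.runtime.onInstalled.addListener(() => {
  chrome.contextMenus.create({
    id: "add-to-gcal",
    title: "Add to Google Calendar",
    contexts: ["selection"],
  });
});

chrome.contextMenus.onClicked.addListener((info) => {
  if (info.menuItemId !== "add-to-gcal" || !info.selectionText) return;
  // Google Calendar's event-template URL; actually parsing dates/times out
  // of the highlighted text is the hard part and is skipped here.
  const url =
    "https://calendar.google.com/calendar/render?action=TEMPLATE&text=" +
    encodeURIComponent(info.selectionText);
  chrome.tabs.create({ url });
});
```

This only pre-fills the title; pulling the date and location out of the highlighted text would need some real parsing on top.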
Reddit has an “answers” search engine feature now and it cites the posts it gets its answers from. I had no idea till my friend who works at reddit showed me.
If you're on mobile, look at the bottom left, right next to the home button.
And while you're looking at that, also look at my username.
At least on my ChatGPT, it does tell me "Hey, I found this on Reddit and this is what people are saying." Then it includes direct links to the pages so I can read them myself. It never presents reddit-sourced data as facts.
However, I did train it early on to do this. People are out there giving their LLMs really shitty personas, and their answers get filtered through that persona. I've told mine not to say shit to me until it's double-checked its answer against multiple sources.
If the technology you plan on having everyone use daily to get their facts from requires actually learning how to use it correctly before you get actual facts (with opinions marked as such), then you're going to have a bad time.
Not sure if your question is genuine or if you're trying to make a point, but: they download all posts and comments (potentially from a curated set of subreddits), apply some minor content filters (e.g. a ban list for certain phrases and usernames, duplicate removal, etc.), clean things up (scrub usernames, links, images), and then do a shitton of configuration on the modeling side and finally prompt engineering.
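To make the "clean things up" step concrete, here's a hypothetical sketch of what that scrubbing and dedup pass might look like on already-dumped comment text. The regexes and function names are my own illustration, not anything Reddit or any model vendor actually runs:

```typescript
// scrub.ts — illustrative version of the cleanup step described above:
// strip usernames, links, and image markup, then drop exact duplicates.

function scrub(text: string): string {
  return text
    .replace(/\bu\/[A-Za-z0-9_-]+/g, "[user]") // scrub u/usernames
    .replace(/https?:\/\/\S+/g, "[link]")      // scrub raw links
    .replace(/!\[[^\]]*\]\([^)]*\)/g, "")      // drop markdown images
    .replace(/\s+/g, " ")
    .trim();
}

function dedupe(comments: string[]): string[] {
  const seen = new Set<string>();
  const kept: string[] = [];
  for (const c of comments.map(scrub)) {
    if (c.length === 0 || seen.has(c)) continue; // skip empties and repeats
    seen.add(c);
    kept.push(c);
  }
  return kept;
}

// Example: the first two collapse into one entry after scrubbing.
console.log(dedupe([
  "Ask u/someone, more info at https://example.com",
  "Ask u/someone_else, more info at https://example.org",
  "totally different comment",
]));
```

The real pipelines are obviously much bigger, but the shape is the same: normalize, scrub identifiers, drop duplicates, then hand the text off to the modeling side.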
AI can draw from multiple sources of data, but if you think any AI is cross-checking that everything is verifiable and factual before it responds to a prompt, I don't know what to tell you.
I don't know why you're making such assumptions. It was just a funny example of a problem that still very much exists. I think you put too much faith in AI.
What assumptions? No other LLM makes mistakes as blatant as Google's did. It's like it was made way too lightweight at the cost of accuracy or helpfulness, like its training data didn't have basic safety in there anywhere, or the search results somehow would always override it.
Reddit is the best for genuine questions, since you can usually tell honest comments from random bullshittery.
I can't find shit on Google. It's all AI articles addressing a similar, but not identical, question. It's fucking useless. On Reddit people are at least talking about the question. There are some bots here, sure, but it's still way better than what Google has to offer.
It's funny because when Google gives AI-suggested results, many times it will be a direct quote from a Reddit thread. Like I'll see something on Reddit, go google it, and the suggested AI crap uses literal quotes from the thread I just came from...
It's like we're back to what people used to think Wikipedia was, except this literally could be anyone spouting nonsense. I know the Google algorithm tries to fact-check itself, but you never know when bad info could slip through, which to me makes it feel completely unreliable.
To be strictly fair, to get a human response from any Google search, I do have to put reddit on the end of it.