Dave's avatar

I have nosed around the EVE community occasionally, and I had a suspicion that some of the submissions were AI-generated. Probably your best bet is to run the bounty brief through ChatGPT a few times and compare the content, style and wording against the submissions. That said, these models seem to run a few years behind, so at least for the moment anything released post-2021 won't be an issue.
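(For what it's worth, here's a rough sketch of what that comparison could look like in code. Purely illustrative: the file names and the 0.6 "suspicious" threshold are made up, and a raw similarity ratio only catches near copy-pastes, not style or tone.)

```python
# Rough sketch of the "compare against ChatGPT output" idea.
# Assumes you've saved the submission and a few ChatGPT attempts at the
# same bounty brief as plain-text files; the names here are hypothetical.
from difflib import SequenceMatcher
from pathlib import Path

submission = Path("submission.txt").read_text()
generated = [Path(f"chatgpt_attempt_{i}.txt").read_text() for i in range(1, 4)]

for i, candidate in enumerate(generated, start=1):
    # Ratio of matching text, from 0.0 (nothing shared) to 1.0 (identical).
    score = SequenceMatcher(None, submission, candidate).ratio()
    flag = "  <-- suspiciously close" if score > 0.6 else ""  # arbitrary cutoff
    print(f"attempt {i}: similarity {score:.2f}{flag}")
```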

Dave's avatar

And to follow up, I agree that the WWW is going to crap and is full of useless junk content. Search engines are becoming pretty bad unless you add "reddit" to the search or only click on links leading to forums, i.e. real people/customers/end users talking about what you are searching for. There is a real opportunity to provide a useful resource here, with some really good timing in terms of this change we are seeing.

Rich's avatar

That's absolutely our thesis, and the whole team has been struck by how widely it's shared. Thanks for your thoughts on AI! As you say, hopefully it'll be less prevalent for new releases that post-date LLMs' training data.

E

Search Engine Optimization (SEO) is the root source of the enshittification of the web. A decade of people designing things to appeal to an algorithm has left us with a wasteland of uselessness. I honestly think that the only real solution is to abandon search engines as they have become and return to human-triaged alternatives. I know that's basically an economic impossibility, but still, it is the best solution to the problem the min-maxers have created.

Dave's avatar

Any news on the progress of the AI detection systems and/or policies? Personally, I think this is a critical site feature/moderation policy that is going to make or break trust in the platform's content.

Joel's avatar

Hi Dave,

Our AI-assisted 'automod' tool is very nearly ready in its v1 form. As we refine and expand it, it will help us catch an assortment of unwanted content, including, over time, flagging AI-generated bounty submissions. We don't want to blanket-ban this stuff, as Rich has outlined above, as we definitely feel it can add value in a number of ways, but we do want to make sure we have visibility on it. Trust and fairness are key, and we're working on and learning which tools can best help us make sure we're delivering those things.

PY

What has happened to game sites is just tragic. I remember having discussions with our editor at the time about how nothing should be created as clickbait or just because it was "game of the moment". The editorial had to be meaningful and have a purpose. It was hard to get eyes on it because of people's reliance on Google and the woeful search results. Nothing has changed; it's probably worse now than it was eight years ago. When I see sites generating 80% of their traffic from search, I know that most of what's on the site is probably rubbish, with a few decent articles from time to time.

I am now seeing so many articles that look AI-generated and are just factually wrong, and it's now another thing to combat. The only way to fight it is to have a solid "real" community that discourages articles that appear to come from AI sources.

Dave's avatar

Local news sites are even worse. There used to be a daily evening paper and a weekly one here, made by a big team of people in a big office. Now it is literally just one person doing everything on our local "{$TOWN} live" site, putting out stories that are just rewordings of the council's tweets, planning applications and whatever "news" people have put on the local Facebook group, with a clickbait headline, tons of ads (which most people block) and little content. It's fairly obvious when the one employee is on holiday, as there is never any news those weeks.

Ad blockers killed off the revenue, then sites ran more and more ads to try to counter that, then they lost more and more people, and now here we are: sites struggling for revenue streams while turning away/putting off more and more of their readers and communities.

PY

You could argue that readers did it to themselves by blocking ads in the first place. Nothing on the Internet is free; someone somewhere is shelling out to keep sites online.

P

Now it is literally just one person doing everything on our local “{$TOWN} live” site putting out stories that are just rewordings of the councils tweets, planning applications and what “news” people have put on the local facebook group with a clickbait headline, tons of ads(that most people block) and little content. It’s fairly obvious when the one employee is on holiday as there is never any news those weeks.

Eurogamer does this all the time... Half of their content has been recycled/repurposed from unpaid "journalists" (who do the real work) on sites like Reddit, NeoGAF, ResetEra and Twitter.

It makes you wonder what "journalism" a lot of these paid/professional "journalists" are actually doing.

MrTomFTW's avatar

AI certainly could be a useful tool, but in the here and now we're not there yet. I certainly do not consider it to be intelligence; as your statement says, it tends to just regurgitate existing works, and it does not take intelligence to do that.

At the moment, using AI is no different from taking the answers into an exam or plagiarising another's essay. Anyone using AI is essentially cheating by using another's work for (in this case financial) gain. As such, I'd not want it used for bounty submissions or, to be honest, on JA in general.

Where AI goes from here I do not know. I can see an application for it as a template, a starting point as you say, but certainly not as a replacement for human creativity. Especially in art, where the people who use and promote AI seem to see it as a commodity for earning, a shortcut even, rather than a creative and expressive process. Ironically enough, it's a mindset shared with NFTs/blockchain when that was the big tech obsession over 2021/22.

Brother Grimoire's avatar

Unfortunately, with the rise of AI tools, there will be users who abuse them.

In this case, we can combat those users with careful curation of what content is selected for bounties. Encouraging quality content will weed out the posters who just copy-paste the response from a prompt.

AI should be respected as a tool, but it shouldn't be a replacement for human creativity and passion.

Lanah Tyra's avatar

Exactly, it's a great tool for certain tasks but definitely not for everything. I can imagine it being used by someone just getting into writing to see how an article should be built up, but definitely not to write the article itself.

I would rather see a short article from a newbie writer who is just getting into it than long, incorrect crap written by AI. Quality over quantity.

MacGybo's avatar

I think I'm going to respectfully disagree with some of the opinions here. But I'll start with where I do agree.

Copying and pasting AI text wholesale and submitting it doesn't seem right to me. There is a creative input from the user: the prompt. If the prompt is creative enough, then they have had an impact on the process, but not enough. They need to do more.

If, however, somebody is using it as a tool, a kicking-off point to create something themselves, that's different. If it's used for inspiration and as something to build on, then it's completely valid. And you can tell when people have and haven't done this.

Let's take my submission for the EVE Online fiction. I have had an idea for an EVE story for a long time, but haven't had the opportunity to write it. The premise was this: prior to the general release of Skill Injectors, there were prototypes doing the rounds. But they weren't ready. The people from whom the brain juice was extracted also left memories behind. They weren't filtered out, and those memories found their way into the people who took the injectors.

I took those ideas, and knew broadly how I wanted it to pan out. I put those prompts to ChatGPT to see where it led. It didn't really work initially, but I changed the prompts and told it how I wanted the tone of the story to go. After a few goes, it started to lead somewhere I was a lot happier with. But it was still too dry. It was overly descriptive and lacked personal touches.

I took the 800-odd words and started to rewrite them. When it was finished, I'd say with honesty that it was a collaboration. It wasn't all mine, but it certainly wasn't all AI. And what's absolutely clear is that nobody else on the planet, AI or human, would have ended up with the same piece that I submitted. It was the equivalent of bouncing ideas off a rather well-educated friend.

I'd say there are parallels with musicians who sample. When hip-hop took off big-time in the early 80s, there were scores of people who berated artists who lifted samples: "They didn't write that tune." But that totally disregards what those artists did with those nuggets. They used them as building blocks. As a base.

A final point, and I'm circling round to where I started. Having said all of the above, let's not underestimate the value of the human intervention that begins the AI process: the prompt. In ChatGPT or MidJourney, the prompt can be a powerful thing. The quality, originality and individuality of what you put in has a massive influence on what you get out. People sell AI prompts; such is their value. This human input at the beginning and the human editing at the end can lead to some really interesting work.

Let the heckling begin......

[Note. This post was not written with AI.]

MacGybo's avatar

A case in point is Douggy Pledger.

He was a MidJourney early adopter. His prompts are now his livelihood. He's had to publicly state multiple times that he can't/won't share what he inputs, despite lots of requests.

His work remains unique. You can tell it's him, but you can also tell that it's AI. The fingers aren't quite right. The faces in a crowd often melt. But nobody else is producing work like this. It's unmistakably Douggy.

Dave's avatar

I get what you mean with MidJourney and the images at the link. The risk with it, though, is that they currently have a copyright lawsuit on the go, along with others; depending on the outcome, there is a risk that users/publishers of its works could also end up in the firing line. Not sure if you have tried Adobe Firefly, but the Generative Fill tool in the Photoshop beta works unbelievably well. It's also trained entirely on a licensed set of works, and they believe (although this is untested) that this makes it safe for enterprise/commercial use once it comes out of beta, so much so that they are going to provide an indemnity clause.

https://www.theverge.com/2023/1/16/23557098/generative-ai-art-copyright-legal-lawsuit-stable-diffusion-midjourney-deviantart

https://techcrunch.com/2023/06/26/adobe-indemnity-clause-designed-to-ease-enterprise-fears-about-ai-generated-art/

Dave's avatar

In my mind, the usage you describe is not a problem. A while back I had an idea for a group here and I couldn't really explain it very fluently. I put my ramblings about it into ChatGPT and asked it to present the text in a more summarised and interesting way, with a better choice of words and structure, then I took that as a skeleton and reworded it back into what I was trying to say. Using it like this, and in the way you describe, is I think more akin to a very advanced checker in a word processor; Grammarly and those sorts of tools come to mind. The way you used it to flesh out/reword the key ideas of the story sounds like a similar usage to my mind.

I don't often visit the EVE community as I'm not a member, but just to clarify, it wasn't the story one that I thought "hmmm" about (and I haven't reported anything, as really I have no idea).

MacGybo's avatar

It's interesting that my story didn't prompt an immediate reaction of 'that's AI'. Thanks for that, and your Grammarly analogy makes sense.
