
Update 01/07/2023 - Valve sent over a statement. Here's what they said:

We are continuing to learn about AI, the ways it can be used in game development, and how to factor it in to our process for reviewing games submitted for distribution on Steam. Our priority, as always, is to try to ship as many of the titles we receive as we can. The introduction of AI can sometimes make it harder to show a developer has sufficient rights in using AI to create assets, including images, text, and music. In particular, there is some legal uncertainty relating to data used to train AI models. It is the developer's responsibility to make sure they have the appropriate rights to ship their game.

We know it is a constantly evolving tech, and our goal is not to discourage the use of it on Steam; instead, we're working through how to integrate it into our already-existing review policies. Stated plainly, our review process is a reflection of current copyright law and policies, not an added layer of our opinion.  As these laws and policies evolve over time, so will our process.

We welcome and encourage innovation, and AI technology is bound to create new and exciting experiences in gaming. While developers can use these AI technologies in their work with appropriate commercial licenses, they can not infringe on existing copyrights.

Lastly, while App-submission credits are usually non-refundable, we're more than happy to offer them in these cases as we continue to work on our review process.


Original article below:

Here's an interesting one on Steam publishing for you. Valve appear to be clamping down on AI art used in games due to the murky legal waters. AI art is a huge topic of discussion everywhere right now, as are other forms of "AI" like ChatGPT, and it's just everywhere. I can't seem to get away from talk about it, from people both for and against it.

In a post on Reddit, a developer who tried to release their game on Steam got word back from Valve that the listing had been denied. Here's what Valve sent the developer:

Hello,

While we strive to ship most titles submitted to us, we cannot ship games for which the developer does not have all of the necessary rights.

After reviewing, we have identified intellectual property in [Game Name Here] which appears to belong to one or more third parties. In particular, [Game Name Here] contains art assets generated by artificial intelligence that appears to be relying on copyrighted material owned by third parties. As the legal ownership of such AI-generated art is unclear, we cannot ship your game while it contains these AI-generated assets, unless you can affirmatively confirm that you own the rights to all of the IP used in the data set that trained the AI to create the assets in your game.

We are failing your build and will give you one (1) opportunity to remove all content that you do not have the rights to from your build.

If you fail to remove all such content, we will not be able to ship your game on Steam, and this app will be banned.

That developer mentioned they tweaked the artwork so it wasn't so obviously AI-generated and spoke to Valve again, but Valve once again rejected it, noting:

Hello,

Thank you for your patience as we reviewed [Game Name Here] and took our time to better understand the AI tech used to create it. Again, while we strive to ship most titles submitted to us, we cannot ship games for which the developer does not have all of the necessary rights. At this time, we are declining to distribute your game since it’s unclear if the underlying AI tech used to create the assets has sufficient rights to the training data.

App credits are usually non-refundable, but we’d like to make an exception here and offer you a refund. Please confirm and we’ll proceed.

Thanks,

Given the current issues with AI art and how it's generated, it really seems like a no-brainer for Valve to deny publishing games that use AI art unless the developers can fully prove they own the rights. Valve's own guidelines are pretty clear on it: developers cannot publish games on Steam they don't have "adequate rights" to.

That said, this is a difficult topic to fully address. Whatever tools Valve use to flag these games, how will they deal with false positives? It's not likely Valve will have a human individually going over every game, and algorithms can be problematic. It's going to be interesting to see how this develops over time. It seems more developers will need to have documentation ready to prove ownership of all their artwork.

I've reached out to Valve to see if they have any comments on it to share.

What do you think about this? Let me know in the comments.


Eike Jul 1, 2023
Yes, this is good. Gotta get rid of the AI generated images (it is difficult to call it art).

It's obvious that this will not happen, right?

I do understand - and share - such feelings, but in the end, it's like trying to get rid of photos in the early stages of photography because they're "not art".
People said similar things about music sharing--where did it go?

Erm, I got no idea which similarity you're pointing at here. (Music sharing, like with Napster?)
Eike Jul 1, 2023
This is not generally a good thing, in my view. Jobs will be lost and the quality of human creativity will be diluted and swamped by a tsunami of this stuff.

The strange thing is that now AI is painting and writing poems, while humans are doing hard work for minimal wages.

Wasn't this once supposed to be the other way around...?
Purple Library Guy Jul 1, 2023
Yes, this is good. Gotta get rid of the AI generated images (it is difficult to call it art).

It's obvious that this will not happen, right?

I do understand - and share - such feelings, but in the end, it's like trying to get rid of photos in the early stages of photography because they're "not art".
People said similar things about music sharing--where did it go?

Erm, I got no idea which similarity you're pointing at here. (Music sharing, like with Napster?)
Yeah. Not so much a conceptual similarity, just a similarity in that people sometimes think something is too basic or unstoppable to be regulated or outlawed, and then it gets regulated or outlawed.
14 Jul 1, 2023
Valve's handling makes complete sense for a large business that wants to remain in good legal standing and not have to potentially remove established games in the future. Better to prevent.

Some have mentioned worries about the mechanisms to detect AI generated art. Not something I'm worried about myself. I know it will not be perfect, just like other tools in the software industry. Will they use "bots" and algorithms to detect AI art? I'd be surprised if they didn't. Why wouldn't they? Software automation is the only way we can handle the scale of content we all use out of necessity or entertainment. It will never be perfect, just like when you buy a box of nails from the hardware store. There is often a bent or misshapen one. That's alright. You deal with a bump in the road and keep driving.

I'm not against AI generated art when legalities are met. AI is a tool just like other software that requires knowledge, skill, and time to use. If the art is ugly, it won't sell, which means a great amount of engineering (AI) and creativity of using the tool are required to get a desirable result. Like other tools of many kinds, you could produce good things or bad things.
Liam Dawe Jul 1, 2023
Updated with Valve's statement.
Grogan Jul 1, 2023
The hype would be quite a bit smaller if they weren't calling it "artificial intelligence", which it really isn't.

That's right, now it's essentially a bunch of machine learning algorithms that they are calling "AI". They also call the behaviour of NPCs in video games "AI" (and that's a stretch of wordsmithing that has always made me laugh).

But it's not always going to be that way. Right now it's a bunch of new legal issues that need to be sorted for when AI does become something a bit more sophisticated.
gradyvuckovic Jul 2, 2023
The recent news about AI generated images getting worse as previous AI generated images end up in the training data shows that it is still far closer to copying than actual creativity.
An intelligent artist improves by referencing their own work, not degrades.

I think this is the key. One of the biggest objections I have is the way many people have insisted there's 'no difference' between AI software having its parameters tuned on a dataset of copyrighted images and 'humans learning art by studying the works of other artists'.

"They're both just looking at pictures and learning, what's the difference bro??"

And every time I hear that my first thought is, "Clearly you don't understand either AI or art".

Those two things are not even remotely the same thing. They really aren't. Humans look at examples of things, find patterns, learn, and invent new things based on what they've seen; they experiment and try things, and stick with concepts they like or that are well received when shown to other people. None of the AI image generators on the market today do that.

The first artists who learned to paint and draw, and to stylise humans with interesting representations, did so without any examples to look at at all. They just looked at other people, experimented with different strokes and lines until they found something they liked, and kept going with it and evolved their style over time.

AI image generators are certainly not doing that. They don't experiment, and they have no sense of even 'liking' or 'disliking' what they're generating or being fed, which is a crucial aspect of how humans learn to do art and how new art styles are developed. So how could the AI be 'learning' the same way humans do?

All AI image generators do is take a bunch of example data, with their algorithms tuned to reproduce that data and to combine it together based on keys: combining the image data of blonde hair with image data of blue eyes to create an image of a person with both. There's no 'learning', there's no 'experimenting', and there's certainly no 'art'.

To me legally there's little difference between running 100,000 images through an AI image generator and claiming copyright over the result, and running 1 image through an image editor with a hue slider to shift the colour and claiming copyright over the result.

The only difference is, the latter example is infringing the copyright of one prior artist, the former example is infringing the copyright of likely thousands of artists. Both equally wrong.

AI image generators do have potential uses, such as automatically generating variations of artworks to fill a need for a large amount of procedural data, but they should not be used as a smokescreen to basically steal copyrighted works from artists, which seems to be the way they're mainly being used right now.

So I think it's very good to see Valve insisting 'If you're using AI image generators in your game, then you should be able to prove you have copyright over the images that were used to train the AI image generator'.
Kimyrielle Jul 2, 2023
"They're both just looking at pictures and learning, what's the difference bro??"

And every time I hear that my first thought is, "Clearly you don't understand either AI or art".

Those two things are not even remotely the same thing. They really aren't

Your point about AI being unable to invent truly new art styles is correct, but largely irrelevant to the discussion. For the overwhelming majority of games being made, their originality is largely not derived from the art assets. Visual elements are often used in a supportive manner, and it doesn't really matter if that elf wizard in an RPG isn't the most original art ever. If all I need is images to visualize my game, AI art will do the trick just fine in this regard.

Also, this point about AI art essentially "stealing" from artists doesn't magically become true just by repeating it often enough. Nothing is being "stolen", ever. The images are used for the training process and are then discarded. No fragments of copyrighted material remain in the published model. During training, the AI -does- learn to reproduce art-styles of existing artists. That's not stealing. If I draw an elf wizard in the style of, say, Clyde Caldwell, I don't violate his copyright, unless I draw an exact replica of one of his paintings. The paradigm that art styles are not copyrightable has been affirmed in and out of court time and again, and is actually one thing artists should beg never to get changed. If art styles were copyrightable, Disney would probably need less than 24 hours to copyright every imaginable art style and no artist would ever do art again without their permission. I don't think that's what we want, no?

As far as training data itself goes, downloading publicly accessible images from the internet isn't illegal. That's also something some artists don't seem to comprehend. You cannot redistribute their images without their permission, but if you download anything, you can do whatever you want with it, as long as any copies or derived works don't leave your house/office. Copyright law restricts redistribution, not private use. In other words, if you don't want people to feed your images to an ML model, don't upload them to the public internet. Should be a no-brainer, but apparently isn't.
Oh, and using such data for machine learning is actually explicitly -allowed- by many jurisdictions (including the UK, home of Stable Diffusion). Even if this changes one day, it won't change the fact that any model released today is operating in the clear, and images produced with it will remain so forever. You cannot retroactively criminalize behavior that's legal today.

The most laughable thing is the statement by Valve (supported by you) asking people to prove that you have copyright/usage-rights for your AI generated content, when the US Copyright Office clarified multiple times that such content is not copyrightable in the first place. How do you prove ownership over images that legally cannot have an owner, anyway?

The gist of the story is still Valve banning AI art on the sheer premise that the legal status quo might change one day, when there is very little indication that it will (the upcoming EU AI Act certainly won't, and there is no indication that the US has any intent to make rules dramatically different from that). It's not something I can support, but hey...
Purple Library Guy Jul 2, 2023
The most laughable thing is the statement by Valve (supported by you) asking people to prove that you have copyright/usage-rights for your AI generated content, when the US Copyright Office clarified multiple times that such content is not copyrightable in the first place. How do you prove ownership over images that legally cannot have an owner, anyway?
You've made some good points, but that's just a grammatical error on your part. Nobody's asking them to prove copyright of the AI generated images themselves. Rightly or wrongly, as far as I can tell people are asking them to prove sufficient rights over whatever the source material was, not over the results.

In the end I think the existence of these things represents a huge challenge to our whole model of copyright, both in itself and perhaps particularly in the way that, in recent decades, we have brought it as far as certain interests could into the model of property. That latter bit isn't so much a problem legally in itself; it's a conceptual problem.

So, let's not forget what copyright is, originally: It is a legal intervention in the world for the purposes of making our economic model viable in the realm of literary production (as far as I know, it was originally all about publishing books, not about art, for instance). And that is what its original justification was--making things work, not any inherent rights that anyone might have. As a side note, it was created mainly for the benefit of publishers, not writers.

As things like copyright became more important and at the same time there was ever greater potential for ordinary people to interact with it, such as by making mix tapes on cassettes, copies of videos, and then all the things the internet lets you do, corporations elaborated a rationale for making copyright more powerful and giving it greater moral force in people's minds--the idea of "intellectual property", which brings the whole capitalist, Lockean property schtick in. And so here we are, arguing about whether people's inherent rights to their "property" are being violated by the uses these "AI" programs are making of them.

And the thing is, quite likely not, but they could still break all intellectual production. As an instrumental, practical matter, "AI" could break the original rationale for copyright, by making it impossible for artists and writers to produce and get paid. At which point we're gonna need a law to stop it, whether the damage is relevant to people's so-called "intellectual property" or not. Whatever we end up with that we still call copyright, would have to be different and appeal to a different rationale--either a different ethical basis, or a spirit more in keeping with early copyright, of just saying we have to have a law so as not to break the economy of intellectual production.


Last edited by Purple Library Guy on 2 July 2023 at 7:01 pm UTC
Kimyrielle Jul 2, 2023
The most laughable thing is the statement by Valve (supported by you) asking people to prove that you have copyright/usage-rights for your AI generated content, when the US Copyright Office clarified multiple times that such content is not copyrightable in the first place. How do you prove ownership over images that legally cannot have an owner, anyway?
You've made some good points, but that's just a grammatical error on your part. Nobody's asking them to prove copyright of the AI generated images themselves. Rightly or wrongly, as far as I can tell people are asking them to prove sufficient rights over whatever the source material was, not over the results.


You're correct. I must have misread their statement. My bad!

Still, what they wrote in their statement is actually worse, because there is nothing "unclear" about using copyrighted data in ML data sets. As I said, multiple relevant jurisdictions explicitly allow it. No part of the "source materials" remains in the model. What Valve is doing is basically turning "innocent until proven guilty" into "you're violating copyright unless you can prove that you aren't", in a situation where nobody can reasonably obtain such proof, or would even be required to.
So what do they expect you to do, honestly? Obtain written permission for every image linked to in the LAION-5B set? When there is no law or legal precedent prohibiting using copyrighted material for AI training in the first place? Ridiculous.

Their decision is horrible, not only because there is very little legal justification for it, but also because, given Steam's near monopoly on the PC games market, it amounts to an industry-wide ban on AI art in games. Companies like Valve should apply a bit more thought and responsibility when making such decisions.

In the end I think the existence of these things represents a huge challenge to our whole model of copyright, both in itself and perhaps particularly in the way that, in recent decades, we have brought it as far as certain interests could into the model of property. That latter bit isn't so much a problem legally in itself; it's a conceptual problem.

So, let's not forget what copyright is, originally: It is a legal intervention in the world for the purposes of making our economic model viable in the realm of literary production (as far as I know, it was originally all about publishing books, not about art, for instance). And that is what its original justification was--making things work, not any inherent rights that anyone might have. As a side note, it was created mainly for the benefit of publishers, not writers.

As things like copyright became more important and at the same time there was ever greater potential for ordinary people to interact with it, such as by making mix tapes on cassettes, copies of videos, and then all the things the internet lets you do, corporations elaborated a rationale for making copyright more powerful and giving it greater moral force in people's minds--the idea of "intellectual property", which brings the whole capitalist, Lockean property schtick in. And so here we are, arguing about whether people's inherent rights to their "property" are being violated by the uses these "AI" programs are making of them.

And the thing is, quite likely not, but they could still break all intellectual production. As an instrumental, practical matter, "AI" could break the original rationale for copyright, by making it impossible for artists and writers to produce and get paid. At which point we're gonna need a law to stop it, whether the damage is relevant to people's so-called "intellectual property" or not. Whatever we end up with that we still call copyright, would have to be different and appeal to a different rationale--either a different ethical basis, or a spirit more in keeping with early copyright, of just saying we have to have a law so as not to break the economy of intellectual production.

I do agree with most of this, even if that underlying issue is a bit out of scope for the discussion. We need to think about how to encourage (and pay) artists in the future. We really need to, and I think lawmakers are already considering options. But this doesn't change the fact that as of today, AI art is legal. Using copyrighted images in ML training is legal. And a market-dominating company shouldn't make a unilateral decision to ban AI art from being used in games just because they can.
Kimyrielle Jul 3, 2023
err... people said that cars were great 100 years ago, best thing since sliced bread, yet here we are fighting climate change,

Cars are still a fantastic thing. We need to update them with a new engine because of climate change, but hey, details. They still do their car thing, even when they're battery powered. Maybe in 100 years, AI can draw people with fewer than six fingers per hand. Technological progress is always fun!


Last edited by Kimyrielle on 3 July 2023 at 5:22 am UTC
kokoko3k Jul 3, 2023
Don't you see contradictions here?
"We welcome and encourage innovation, and AI technology is bound to create new and exciting experiences in gaming."
How is an AI supposed to "create" or "innovate" when it just vomits associations of other work it has been trained upon?
Granted, the process "aims to be" similar to what the human mind does right now, but AI is still trained by humans with specific sets of "BIG" data.
Instead, human creativity comes from randomness, from chaos, from living an entire, unpredictable life, which is different and individual.
What will AIs be trained upon over the next 100 years? The need for "bio"diversity comes to mind...

Let's try to stop this trend of making everything flat.


Last edited by kokoko3k on 3 July 2023 at 3:30 pm UTC
TobyHaynes Jul 3, 2023
The images are used for the training process and are then discarded. No fragments of copyrighted material remain in the published model. During training, the AI -does- learn to reproduce art-styles of existing artists.

I expect to see multiple legal cases where this statement gets tested.

Essentially, the original works being fed into the ML models are under copyright. Much of the art / film / music under copyright is also licensed under complex terms including residuals - the original artist gets paid when their art is used or shown, distributors get their cut, and so on.

If you take any work that is only legally available under license (i.e. most of it) and you use it in an AI training data set, what legal standing do you have? If you did not have the license up front, you used data that you did not have permission to use. You cannot claim your model does not contain the source material in some form, because it was used as input and must remain in some transformed form, even as algorithmic weights, in the ML model.

Valve is playing it safe while the legal aspects are litigated. For a large international business, Valve's position is the only legally safe avenue.
Eike Jul 3, 2023
How is an AI supposed to "create" or "innovate" when it just vomits associations of other work it has been trained upon?

AI is more "creative" than most humans.

These comparisons are getting ridiculous. Yes, computers are less creative than Pablo Picasso, less intelligent than Albert Einstein, and every few million miles of driving they kill a person. But they can compete with an average human in all of these fields easily. Most of us don't write poems or paint pictures at all.
Purple Library Guy Jul 3, 2023
How is an AI supposed to "create" or "innovate" when it just vomits associations of other work it has been trained upon?

AI is more "creative" than most humans.

These comparisons are getting ridiculous. Yes, computers are less creative than Pablo Picasso, less intelligent than Albert Einstein, and every few million miles of driving they kill a person. But they can compete with an average human in all of these fields easily. Most of us don't write poems or paint pictures at all.
Uh, they can't though. You have a point overall, but computer driving is pathetic. It looked great at the beginning and I was totally in for the idea, but it didn't get that much better and intractable problems didn't get solved. The "per X million miles" thing sounds good, but it's basically one of those "how to lie with statistics" things--they give a gee-whiz number but don't make the comparison. The human accident rate is actually way better (it's quite surprising how good human drivers are, considering how crappy they seem when you're on the road). And that's despite the fact that they never, ever train these things anywhere with heavy traffic. And the computer driving systems are still absolutely incapable of successfully pulling off left turns in traffic. There's a reason the hype for computer driving has sort of quietly trickled away... they ran into diminishing returns and it's being slowly given up on. Even one of the top founders of computer driving has shifted to a company that just does it for specialized mining trucks on mining sites, where the task is very simple and definable, because he concluded the general case just wasn't working.

Some day I'm sure they'll get it beat, but that will be a different generation of software based at least in part on different ideas. I think the same is going to be true of a lot of people's expectations for "AI" chat and so forth.
const Jul 3, 2023
The hype would be quite a bit smaller if they weren't calling it "artificial intelligence", which it really isn't.

That's right, now it's essentially a bunch of machine learning algorithms that they are calling "AI". They also call the behaviour of NPCs in video games "AI" (and that's a stretch of wordsmithing that has always made me laugh).

The AI paradox: Once it's solved, it's not AI any more. That always happens when people start to get used to computers doing things developed by AI research.

And about those thinking the *AI* lacks understanding and control: that's what the human providing the input and whatever is inside the model provide together - as a team, iteratively.
There are certainly people who master these tools in an artistic way. Postprocessing can also be a creative and artistic task - just like taking a perfect photo of a mountain that has been photographed a million times before.
How much creativity must go into a work to make it art? In my local art museum, there are pictures that are one-colored, two-colored, or unpainted and slit with a knife. Beuys took a table, turned it around and put a piece of butter on top, and now that's protected from overambitious cleaning personnel for half a century while people discuss whether that is art.
Also, real artists are drawing Harry Potters and Yodas all the time, and it depends on the work and on opinion whether this is derivative or not.
So the problem is art-complete: if you want to decide whether ML-generated art is legitimate art, you would have to decide whether any given work is art, and there is no general solution to that problem.
That's why the debate is completely political and moral. I kind of like the "everything generated is public domain" idea, but that's not quite enough. If you ask me, the models should be force-published as public domain too, unless every input has been licensed from the original creator, to ensure everyone can profit from this breakthrough that humanity has worked on for decades.


Last edited by const on 3 July 2023 at 9:58 pm UTC
kokoko3k Jul 4, 2023
How is an AI supposed to "create" or "innovate" when it just vomits associations of other work it has been trained upon?

AI is more "creative" than most humans.

These comparisons are getting ridiculous. Yes, computers are less creative than Pablo Picasso ... But they can compete with an average human in all of these fields easily. Most of us don't write poems or paint pictures at all.
I firmly disagree.

Creativity is not just writing nice poems, painting nice pictures, or composing musical scores that meet common taste.
Every human on earth is more creative than the best AI as of today.
Are you Picasso or an average human? You really think your creativity cannot match an AI?
I certainly don't, and I'm not Picasso either.

Using an AI to make art is not about being unable to do it on your own; it is just laziness, easy and quick money, more or less just another step straight down the road to flatness.
Eike Jul 4, 2023
Are you Picasso or an average human? You really think your creativity cannot match an AI?

Yes, I fear I'm less creative at least in writing poems and painting than an AI.

I hope I'm still better (more creative?!? I don't know!) in my best field, software development.

But all of this seems to be a "Rückzugsgefecht" (dictionaries say it's "rearguard action" in English?) of mankind.
We're trying to defend what we think we're best in and what seems to be the "most human" trait to us, and it gives us strong feelings, and it makes us fear.


Last edited by Eike on 4 July 2023 at 10:15 am UTC
kokoko3k Jul 4, 2023
Are you Picasso or an average human? You really think your creativity cannot match an AI?

Yes, I fear I'm less creative at least in writing poems and painting than an AI.

I hope I'm still better (more creative?!? I don't know!) in my best field, software development.

But all of this seems to be a "Rückzugsgefecht" (dictionaries say it's "rearguard action" in English?) of mankind.
We're trying to defend what we think we're best in and what seems to be the "most human" trait to us, and it gives us strong feelings, and it makes us fear.
I understand your point and I'm all for artificial life.
But I still think that we, as Natural Intelligences, are much better trained and will be for a long time to come, because we have far more experiences; our training set of big data is much more vast, and our potential is still so much bigger.

But if we start to rely on AIs, our set will stop growing, or will grow in a way that is driven by someone else with a lot of money in their pocket.

That's what I fear.