Key points:
- Cara’s Rapid Growth: The app gained 600,000 users in a week
- Artists Leaving Instagram: The controversy around Instagram using images to train AI led many artists to seek an alternative
- Cara’s Features: The app is designed specifically for artists and offers a ‘Portfolio’ feature. Users can tag fields, mediums, project types, categories, and software used to create their work
- Scale: While Cara has grown quickly, it is still tiny compared to Instagram’s massive user base of two billion
- Glaze Integration: Cara is working on integrating Glaze directly into the app to give users an easy way to protect their work from being used to train AI
more about: https://blog.cara.app/blog/cara-glaze-about
Pixelfed looks like it’s doing a huge push to get up to speed. It has been an immature app/platform for a long time, and slow to get the features people need from a photo-sharing social network.
According to their Mastodon, they are working on better AI-management features and launching an app that will make it a genuinely positive experience.
I really want Pixelfed to take off and this really could have been a moment, but after using it for more than a year now, I just can’t see it. Development is very slow - it feels like a one-man show (it might not be). We do need an alternative to Instagram, but yeah…
The official app is available in beta. I’m very impressed with it
Who. The fuck. Cares
I’ll be watching this curiously from a safe distance for now. I am interested in a new platform without AI, but this stinks of early-stage enshittification.
They have huge bills to pay already. It can totally lead to enshittification.
I don’t understand how this Glaze thing is supposed to stop AI being trained on the art.
It pollutes the data pool. The rule of GIGO (garbage in, garbage out) applies: feed the model garbage training data and you get garbage results out of it.
Basically, it puts some imperceptible stuff in the image file’s data (somebody else should explain how because I don’t know) so that what the AI sees and the human looking at the picture sees are rather different. So you try and train it to draw a photorealistic car and instead it creates a lumpy weird face or something. Then the AI uses that defective nonsense to learn what “photorealistic car” means and reproduce it - badly.
If you feed a bunch of this trash into an AI and tell it that this is how to paint like, say, Rembrandt, and then somebody uses it to try to paint a picture like Rembrandt, they’ll end up getting something that looks like it was scrawled by a 10-year-old, or the dogs playing poker went through a teleporter malfunction, or whatever nonsense data was fed into the AI instead.
If you tell an AI that 2+2=🥔, that pi=9, or that the speed of light is Kevin, then nobody can use that AI to do math.
If you trained ChatGPT to explain history by feeding it descriptions of games of Civ6, then nobody could use it to cheat on their history term paper. The AI would go on about how Gandhi attacked Mansa Musa in 1686 with all-out nuclear war. It’s the same thing here, but with pictures.
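To make the GIGO point concrete, here’s a toy sketch (nothing to do with Glaze’s actual internals — just a tiny nearest-neighbour classifier trained once on clean labels and once on poisoned ones):

```python
import numpy as np

rng = np.random.default_rng(0)

# Two well-separated clusters: class 0 near (0,0), class 1 near (5,5).
X_train = np.vstack([rng.normal(0, 0.5, (50, 2)),
                     rng.normal(5, 0.5, (50, 2))])
y_clean = np.array([0] * 50 + [1] * 50)

# "Poisoned" training set: same images, every label flipped.
y_poisoned = 1 - y_clean

def nn_predict(X_tr, y_tr, X_te):
    """1-nearest-neighbour prediction."""
    dists = ((X_te[:, None, :] - X_tr[None, :, :]) ** 2).sum(-1)
    return y_tr[dists.argmin(axis=1)]

X_test = np.vstack([rng.normal(0, 0.5, (20, 2)),
                    rng.normal(5, 0.5, (20, 2))])
y_test = np.array([0] * 20 + [1] * 20)

acc_clean = (nn_predict(X_train, y_clean, X_test) == y_test).mean()
acc_poisoned = (nn_predict(X_train, y_poisoned, X_test) == y_test).mean()
print(acc_clean, acc_poisoned)
```

Same model, same inputs — only the training labels are garbage, and the poisoned model gets essentially everything wrong. Glaze perturbs the images rather than the labels, but the goal is the same: make what the model learns diverge from what humans see.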
Right, but AFAIK Glaze targets the CLIP model inside diffusion models, which means any new version of CLIP would remove the effect of the protection
It’s not. It’s supposed to target certain open source AIs (Stable Diffusion specifically).
Latent diffusion models work on compressed images; that takes fewer resources. The compression is handled by a type of neural network called a VAE. For this attack to work, you must have access to the specific VAE that you are targeting.
The image is subtly altered so that the compressed image looks completely different from the original. You can only do that if you know what the compression AI does. Stable Diffusion is a necessary part of the Glaze software. It is ineffective against any closed-source image generators that have trained their own VAE (or equivalent).
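A minimal numpy sketch of the underlying idea, with a stand-in random linear map playing the role of the VAE encoder (the real one is a deep nonlinear network, so this is purely illustrative). If you know the encoder, you can find a tiny per-pixel perturbation that moves the compressed representation a lot:

```python
import numpy as np

rng = np.random.default_rng(1)

# Stand-in "encoder": a fixed linear map from 64 "pixels" down to an
# 8-dim latent. A real VAE is nonlinear, but the premise is the same:
# the attacker knows exactly what the compressor does.
E = rng.normal(size=(8, 64))

x = rng.normal(size=64)   # the original "image"
eps = 0.05                # per-pixel perturbation budget (imperceptible)

# For a linear encoder, the latent shift caused by a perturbation delta
# is E @ delta. To move the latent as far as possible under a small
# per-pixel budget, push every pixel by eps in the direction of the
# encoder's most sensitive input direction (its top singular vector).
_, _, Vt = np.linalg.svd(E)
delta = eps * np.sign(Vt[0])

latent_shift = np.linalg.norm(E @ (x + delta) - E @ x)
pixel_change = np.abs(delta).max()
print(pixel_change, latent_shift)
```

No pixel moves by more than 0.05, yet the latent representation shifts by far more than that — the compressed image “looks” very different to the model. The catch, as noted above, is that this only works against the specific encoder you computed the perturbation for.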
This kind of attack is notoriously fickle and thwarted by even small changes. It’s probably not even very effective against the intended target.
If you’re all about intellectual property, it kinda makes sense that freely shared AI is your main enemy.
deleted by creator
I’m sure it works fine in the lab. But it really only targets one specific AI model; that one specific Stable Diffusion VAE. I know that there are variants of that VAE around, which may or may not be enough to make it moot. The “Glaze” on an image may not survive common transformations, such as rescaling the image. It certainly will not survive intentional efforts to remove it, such as appropriate smoothing.
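A toy 1-D illustration of why ordinary smoothing hurts this kind of perturbation (this is a stand-in signal, not actual Glaze output): the “glaze” lives in high spatial frequencies, and a simple moving average — the kind of filtering that happens incidentally when an image is rescaled or re-encoded — attenuates exactly those frequencies while leaving the visible content nearly intact.

```python
import numpy as np

# A smooth "image" row plus a high-frequency, low-amplitude perturbation
# (the kind of signal adversarial perturbations tend to rely on).
t = np.linspace(0, 1, 200)
clean = np.sin(2 * np.pi * t)               # the visible content
glaze = 0.05 * np.cos(2 * np.pi * 50 * t)   # imperceptible wiggle
glazed = clean + glaze

# A 5-sample moving average, standing in for rescaling/re-encoding.
kernel = np.ones(5) / 5
smoothed = np.convolve(glazed, kernel, mode="same")

residual_before = np.abs(glazed - clean).max()
residual_after = np.abs(smoothed - clean).max()
print(residual_before, residual_after)
```

After smoothing, the remaining deviation from the clean signal is a fraction of the original perturbation, while the low-frequency content survives almost untouched. That asymmetry is why these attacks tend not to survive even routine image-processing pipelines.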
In my opinion, there is no point in bothering in the first place. There are literally billions of images on the net. One locks up gems because they are rare. This is like locking up pebbles on the beach. It doesn’t matter if the lock is bad.
Saw a post on Bluesky from someone in tech saying that eventually, if it’s human-viewable, it’ll also be computer-viewable, and there’s simply no working around that. I wonder if you agree with that or not.
Sort of. The VAE, the compression, means that the image generation takes less compute, i.e. cheaper hardware and less energy. You can have an image generator that works directly on the pixels visible to humans. Actually, that’s simpler and existed earlier.
By Moore’s law, it would be many years, even decades, before that efficiency gain is something we can do without. But I think, maybe, this becomes moot once special accelerator chips for neural nets are designed.
What makes it obsolete is the proliferation of open models. E.g. today Stable Diffusion 3 became available for download. This attack targets one specific model and may work on variants of it. But as more and more rather different models become available, the whole thing becomes increasingly pointless. Maybe you could target more than one, but it would be more and more effort for less and less effect.
if it’s human-viewable it’ll also be computer-viewable
Sort of. If you raise a person to look at thousands of pictures of random pixels and say “that’s a fox” or “that’s not a fox”, eventually they’ll make up a pattern to decide whether the random pixels are a fox or not. Meanwhile, someone raised normally will take one look and go “that’s just random pixels, it’s not a picture of anything”. AI is still in that impressionable stage. So you feed it garbage and it doesn’t know it’s garbage.
Not only is this kind of attack notoriously unstable, but identifying which images have been glazed is a fantastic indicator for finding exactly the high-quality art you want to train on.
I doubt that. Having a very proprietary attitude towards one’s images and making good images are not related at all.
Besides, good training data is to a large extent about the labels.
Isn’t there already artstation.com? Just had a look at Cara and it looks very similar
It’s Instagram mashed up with ArtStation
There was an anti-AI art campaign by ArtStation users a few months back, due to ArtStation allowing AI images.
So what happens when this app needs to pay server costs for 600,000 people?
Re: the hosting company
Your account does not appear to have spend management enabled, which would allow you to pause your project entirely if you hit a certain level of spend.
So, this is something of a devil’s bargain. Either shut down your website just as it’s catching fire and gaining traction. Or get billed a year’s server budget in a matter of days because of exploding costs.
In a saner world, this might be used as an argument for treating the Internet as a public utility and not a for-profit rent. Perhaps more companies could grow and sustain large pools of customers if they weren’t kneecapped by their own momentum.
Instead, I’m sure we’re going to see more exotic insurance and finance services designed to siphon money out of websites as a hedge against unexpected growth.
So what happens now? I doubt they have figured out monetization yet, right?
They basically just have to move off Vercel. There are a lot of much cheaper alternatives, though
OK, but they have a 90k bill now?
Can probably be massaged a bit
Massaging costs extra though?
Do you mind telling me what this says? It seems Firefox doesn’t load Twitter anymore. Or maybe you need an account? I’m not sure, but it says “error”
“Jingna Zhang @ cara.app/zemotion @zemotion So freaking speechless right now. Seen many @vercel functions stories but first time experiencing such discrepancy vs request logs like, this is cannot be real??”
I’ve heard from many ppl that vercel is pretty nasty that way, and to only use them for learning and toy projects.
Twitter no longer loads newer tweets if you’re logged out. Instead of showing a proper message, it either fails to load or redirects to the login page. They did that to prevent scraping.
According to their terms of service, everything uploaded to their website is then owned by them. Doesn’t seem very artist-friendly to me.
Ok, the lady behind Cara just WON a f-ing copyright lawsuit against some dick that stole her artwork. I’m 100% sure the wording is so that if you *think* about stealing from Cara, she will come after your ass with both guns blazing.
Regardless, their terms of service let Cara not only sell prints of your artwork to third parties but also sell your artwork for AI training if they wanted to.
Instagram, for all its faults, specifically says that it doesn’t own your artwork and only gets a license to show it.
I don’t really care what she won, people tend to cave really fast if given proper financial incentive.
No, it doesn’t. It states that the copyrighted works are the property of Cara and/or the artist who created the Works, except where otherwise noted. This specifically would cover cases where someone attempts to claim that a Work they found on Cara isn’t copyrighted because a copyright notice wasn’t explicitly stated, and doesn’t make explicit claims over the ownership of any arbitrary Work. For it to work in the way you’re claiming, the “or” cannot be present as it being there implies the existence of Works on the site which Cara does not have property rights to. Who actually possesses the property rights to any given Work is left, apparently intentionally, ambiguous.
cases where someone attempts to claim that a Work they found on Cara isn’t copyrighted because a copyright notice wasn’t explicitly stated
In what country is that a thing?
None that I’m aware of, but for a copyright to be asserted in the US a human must be associated with it as a consequence of the monkey selfie case. My reading is that this would cover the edge case of an anonymous, unknown poster submitting the work, allowing Cara to act as the default rights holder unless otherwise asserted by a person or user.
Why are you twisting it to make it seem like Cara is doing a good thing? What’s your motive? What is the difference between Cara owning it by default and the uploader owning it by default? Why can’t it just be the owners property?
Because “anonymous” isn’t necessarily a person who can answer for copyright. They literally gave you a use case where it could help, in the comment you’re arguing against…
It doesn’t work like that. The monkey selfie case did not set any kind of precedent. Animals cannot own property, including copyrights.
For a work to be under copyright in the US, it has to be an “original work of authorship” and contain “a modicum of creativity”. Some countries allow broader copyrights. Photographs that are accidentally triggered are public domain. CCTV footage is a gray area. Setting up a camera and luring animals into triggering it, might produce copyrighted images. A court would have to decide if the individual circumstances constitute authorship and a modicum of creativity. An animal snagging a camera and triggering it certainly doesn’t. The monkey selfie case did nothing to advance the law.
A public domain image is just that. Attempting to assert ownership over one is either an error or fraud. I don’t know what the US rules are when a rights-owner can’t be found. I doubt that you can just become the default owner of some property just by writing something on a website.
The monkey selfie case did not set any kind of precedent.
literally next sentence.
Animals cannot own property, including copyrights.
This sounds like a precedent…
That’s not true
???
The clause is literally a comment or so down and available on their website.
deleted by creator
Why did you specifically not put this part in bold: “and are the property of Cara”? Clearly you saw it if you took the time to avoid putting stars around it.
the property of Cara and/or the individual artist
This seems worded to muddy the waters about who actually has the copyright.
The monetization plan might be to sell prints of platformed artists’ work, without any need for pesky royalties.
…until they decide to sell their company. Or their user data. Or the shareholders say so. Or…
Thanks for the link. This is pretty much what I expected.
the crowdfunding/patronage of this platform only helps them build their proprietary empire. It’s like giving money to your neighbor who wants to build a swimming pool on their property because they promise you’ll be able to swim in it.
I knew that C looked familiar!
They actually seem quite a bit different. The one for Cara isn’t perfectly round and seems to suggest a person in the middle.
Yeah, they’re different, but “white circular C on a black background” just made me think of the CN one.
What is their monetisation plan? Currently they don’t seem to have anything other than donations?
You’ll have to ask the company that eventually buys them out!
I did and they said selling user data, promoting certain content, targeted ads, superchats, and a checkmark thing that costs five dollars a month that only 4 people get.
What’s a “plan”?
This will be the headline a month later:
Cara’s monthly active users down to a few thousand. Here’s why.
I’m no fediverse zealot and I welcome projects like Cara, but at the beginning there are always lots of sign-ups
Yet another centralised social network. That pinky-promises they’ll never go bad.
Join now! Bring your friends! No ads! Everything’s free! We’re indie!..
Moments later… enshittification ensues.
Does it seem odd… This is a crowd that is all about “hands off muh property”. And yet they see nothing suspicious about someone giving them a free service.
Solves the problem for a few years until Meta buys their users and data back.
Assuming they don’t own them already as a sort of pressure valve. Yeah I’m getting that cynical.
Yep, this is just Instagram again with a little anti-AI image filter on top. And a portfolio, not a photo album!
If it’s not as interoperable as email, it belongs in the trash
Out of the frying pan, into the fire.
Sounds like another pan
🤷 all we have to do is keep moving faster than our waste stream.
Platform has cool ideas, gets users, gets greedy, gets infected with bots and scammers, users leave for new platform with cool ideas…
Accept the idea that you are not going to have a thirty year old Yahoo Answers account and even if you did you won’t be using it, and make peace with it.
This exactly. And also the more splintered similar user bases are, the better
More competition, less easy to enshittify a “captured” user base