the article says they're specifically using generative AI though. there's no misconstruing what that means. and they're working directly with the larger companies that were fighting for extensive generative AI use before, during, and after the strikes (a cached version of their website reveals Disney, Netflix, and Amazon as just a few of their clients). a smaller company using it sets a precedent for larger companies to use it to the fullest extent they can. within the bounds of their agreements with SAG-AFTRA, WGA, DGA, and the like, they're gonna do whatever's most cost-effective, and that includes forgoing creatives and using AI.
If you're considering spending actual time insisting on the value of AI in creative spaces right now, my suggestion would be: maybe rethink that.
Does "generative AI" specifically mean actors, not all elements of the image, which may or may not include actors? I can't imagine the suits passing on cost savings cuz they're unseemly among media watchers, but I also don't think we're talking AI stars any time soon, so much as more ghastly extras and uncanny backgrounds. And eventually they'll be less ghastly and canny. It is what it is, it feels superficially similar to the convos about auto-tune and pro tools way back when. What is the best case "defensive" scenario?
AI will always be a hindrance because it is just mimicking things that came before. Maybe it starts with filling in the crowd at a fake game (although we saw how terrifying it was in that Disney movie Prom Pact), but sooner or later it is used to sub in for actors who are not available for reshoots, or maybe they decide to use it for some ADR lines. It needs to be opposed as often and as forcefully as possible.
it goes beyond actors. any program that relies on existing work to automatically generate new work, audio, visual, or otherwise, is generative. studios look to this technology to save the money they usually spend on staff and creatives to do these tasks. there are plenty of graphic designers, special effects artists, audio engineers, composers, scriptwriters, etc. who are willing and able to contribute to these projects, and frequent use of AI robs them of those opportunities. we need to stop giving studios and media companies the benefit of the doubt when it comes to safe and fair use of AI. they've shown countless times that they're not above taking the cheapest and fastest option. otherwise, we wouldn't have had months of striking in the entertainment industry last year. even if you just wanna call AI a technical advancement and a tool in one's creative tool belt, the same way a computer was to pen and paper, that doesn't change the fact that a) the output of AI usually isn't as good, and b) even with post-processing and editing to make something look better, it's still using someone else's time and intellectual property in a cheap, exploitative way! just hire someone to do the creative work you're wanting an AI program to do, because it'll probably turn out better.
that reminds me too much of "drum machines have no soul" stickers. I've done two pretty AI-heavy album releases this year. The first one was the Dandy Warhols: https://m.youtube.com/watch?v=H3H2AVm1uD8&pp=ygUNRGFuZHkgd2FyaG9scw==

This video cost essentially nothing and crushed from a performance standpoint, giving this song a huge bump at radio, and the record ultimately charted #17 Billboard Tastemakers here and #22 in the UK. Taste varies, but from a creative perspective, I didn't experience this project differently from any other creative corner of band shit. And the $10k we saved on that video is money deployed elsewhere in the project on things like remixes and more video.

Likewise, I'm wrapping a project with another band, let's call them the b hole surfers to keep it anonymous. For this one the band wanted to use an AI app to make a cover, and I had a very rewarding experience downloading the app as well and playing around with prompts until they hit on what they wanted, which we then handed off to our graphic designer. In this scenario, the job lost is a commission for an artist to create the cover image. I don't know how to weigh that against the band realizing their vision themselves.

In the case of the Dandys, AI and its oddity are very much a part of the image, not a hurdle. I don't think this other album cover scans instantly as AI, but I guess we'll find out later this year.
The band you're supporting did what larger businesses want to do: cut costs, which equals lost work for creatives.
idk man, pretty easy to simply pay a working visual artist for their original art rather than do mental gymnastics to justify that AI is actually doing good in some way (it isn't) because your artists are able to spend that money elsewhere (vs, you know, supporting someone's livelihood). Anyone using it is lazy and cheap. Doesn't matter what level you're at.
For me as an artist, the joy of art is in the act of creating it, and the vast majority of what AI is being used for in art is to replace the craft of art itself. As a consumer, I like stuff like Not Even a Show, where a comedian uses AI to prank shitheads, and even BBL Drizzy is an undeniably catchy beat; sampling an AI-generated song and making something out of it is less offensive to my views on art. But that doesn't change the fact that the AI is scraping real art that someone actually made, without credit or compensation, and that you could also write and arrange something with the intent to sample it and actually hire musicians to perform it, which reinvests in the global or local art/music community in some way, which is not what AI or the vast majority of content creators using it do. As a consumer I still find the craft and form important to the art I enjoy most, and when AI is used in the name of efficiency I just don't respect it.
The thing is that, as of right now, AI replaces rather than enriches or advances. We're too busy not giving a shit about our own people and money to stop this from going on. It's exactly what's going to be our downfall.
A mid-level band like Warpaint may not have the resources to hire a group of musicians, hire an arranger, book studio time, etc. If it's fucked up to use an AI tool for a string quartet part, is it likewise fucked up to record that arrangement with a synth? If they use synths, which also make parts of the artistic process easier, and with the money saved bring a lighting designer on the road, is that an adjudicable choice? Does that earn a side eye? Widening the use cases, the AI Beatles song bites the big one cuz it sucks and it's stupid, but the Let It Be documentary rules, despite uncanny AI upscaling and probably because of the incredible AI audio/dialogue isolation. I know gobs of people who had totally authentic, creatively activating responses to that movie. I was in the studio with LJG and some of Lemuria when it came out and we had so much fun cuz we were all so stoked on how cool that doc is. Maybe the AI components of the movie had nothing to do with that joy, who knows, but it doesn't seem like a binary question.
Yeah, that Dandy Warhols video looks like ass. Like so much of AI-created art, it has that weird level of gloss on it to cover up the fact that faces and words aren't correct. It's nauseating. I would be embarrassed to have a music video that looks like an ad for a mobile game.
Being honest, the question "is it okay to use AI to do this thing that I otherwise can't afford" feels like the same logic as "is it okay to crack this software since I can't afford a license" or "is it okay to download this song since I can't afford the album". At the end of the day, it's still taking money out of the hands of someone who would otherwise receive it and, if enough people do it, it either forces people to raise prices to compensate or lower prices to compete, which in either case is destabilizing. I dunno, maybe that's extreme, but that's where my logic takes me.
AI actively seeks out and uses previously created work to imitate it, without crediting or compensating or even seeking the original artist's consent; a synth does not.
usually someone created the sound and it comes with the interface you use, or it comes in a pack you buy, etc. I'm not using synths created with generative AI to my knowledge, and would be bummed if I paid for a synth that was created that way but did not disclose it.
there's definitely a power dynamic to it all. the biggest issue is generative AI in the hands of those with plenty of resources who can otherwise afford to pay creatives to make the same content. when those with fewer resources use it, it sets a precedent for executives and elites that says "if they can use it, why can't we? we can use better models and use them in a way that cuts costs usually reserved for paying others to do this". re: the Beatles thing, the technology used in the audiovisual restoration process was a machine learning model, not a generative model. there's a big difference between using AI to restore preexisting content and using it to generate new content entirely. the problem is not technological advancement in itself, but its use to replace tasks that actual humans are more than capable of performing, and can usually perform better.
Patches, tbf, but all of these examples are doing something markedly different from generative AI, where the entire function is to scrape previously created works and pull from that data without consent or compensation.