Bring your slop to work day

May 28, 2025

Photo of a yellow sunflower, by Lukas.

Earlier in the year, our fourth grader was working on a big science project. Every kid in class had to design an experiment with either light or sound. They got to choose. Our kid chose light.

The first step was to document the design of the study. Then they had to write out a hypothesis: what they thought would happen and why. Next, they had to conduct the experiment, document the results, and reflect on their initial hypothesis to capture the learnings. It was a cool project in that it was both a science project and a thinking project.

We got to the end of the experiments. The child was writing up all the findings and going back to their original hypothesis to see how well it had fared. This part of the assignment happened in Google Classroom. While bringing the kid a snack, we peeked over their shoulder. For every word they typed, Google tried to anticipate the next word.

We weren’t sure if they knew what was happening, so we asked, “That thing where the greyed out words are coming up, is that something you turned on?”

“Huh?” They looked up from the keyboard, where they were hunting and pecking. “Oh that. No, that’s on in all of the docs we use for school. I don’t know how to turn it off but it’s annoying.”

There’s an old joke in tech that goes: while everyone else’s children play with apps, the children of technologists play with rocks. The assignment was a good one. Think about a problem, start with a hunch, validate or invalidate the hunch with data, reflect. These are important skills. And they show up everywhere, not just in science.

But if, instead of thinking, our kid had hit the tab key 50 times, they probably would have generated something passable. Brilliant? No. But so few fourth-grade science projects truly advance the corpus.

Advancing the corpus

We’re hearing more about AI at work. Less about the places it’s truly advancing the corpus and more about the places you can hit the tab key 50 times and get something passable.

The fun part is to compare notes cross-industry. To see where the waves of slop are lapping against the shore. And, to be clear, this is not us saying that everything is slop. This is us saying that some of it is slop. And slop is popping up everywhere.

Recently, a friend of ours was leaving a senior creative role. On his way out the door, his agency asked if he would train AI on his process so that even after he left, they could still apply his approach to their client work. He declined.

We shared that we see similar things in our line of work. That there are folks who deliver management training by generating a script and a deck and then feeding both through a teleprompter. And that we’re starting to see entire conferences full of talks generated the same way.

He thought about this and then said, “Do you really hate the work that much? Like how much do you have to hate the work to skip all the parts that go into making something great?”

We talked about our love for the early, messy bits. Where you check your assumptions, validate or invalidate your approach, and get to something that truly moves things forward. Or not. But either way, how the process itself is an important refinement tool.

A significant negative correlation

Any time you talk about slop, people are quick to point out that AI has real and practical use cases. And that lots of people were meh at their jobs before AI. And that lots of work is mediocre. In a toss-up between mediocrity without labour protections and mediocrity with them, obviously team robot wins the point. Besides, while you might sentimentalize all that grunt work and process, the counterargument is that handing off your low-value work frees you up for more important stuff.

Now, inconveniently for the AI advocates, the bots are not currently great at this. We know, we know, the companies selling chatbots want us all to know that it passed its LSATs or whatever. But if you’ve ever spent any time supervising a GPT-style bot in a professional context, you know that the frequency of errors is so high, and so consistent over time, that any human performing that poorly would take a long walk off a short PIP. But in theory, that’s the promise. Use the bot, it will do that work so you don’t have to.

What’s maybe less obvious is the impact that has on you. Like, yes, for sure there’s the finding that prolonged use tends to result in emotional dependence and mental health issues. That’s not an ideal research finding. Maybe not a perfect workplace safety risk for your employer to be compelling you to use. It’s definitely giving some low-key asbestos vibes. But, like asbestos, the pitch has always been that the risk is worth the pay off. That offloading that low-value stuff is necessary to give you the room to focus on the big meaty thinking. The bummer is…how to put this? Offloading doesn’t work that way.

“The findings revealed a significant negative correlation between frequent AI tool usage and critical thinking abilities, mediated by increased cognitive offloading … they may inadvertently diminish users’ engagement in deep, reflective thinking processes.”

Whoops. Deep, critical, reflective thinking genuinely is essential to your job as a leader — it’s what we pay you for. And whatever the sales pitch was that got you to try this tool, or got your boss to force it into your organization, the real-world impacts are pretty clear.

The funny thing (but not ha ha funny) is that this research has been out for a while, but it’s usually met with a shrug. So much of the day-to-day discourse around AI use at work is shruggy. ChatGPT is writing your coworkers’ emails. Or their performance reviews. Or their conference talks. shrug. It’s all extremely mid. Mediocre AI summaries of mediocre AI emails written about other mediocre AI summaries. shrug. And if you raise your hand to ask why we’re okay with this much slop at work. Why we’re okay with how disrespectful and literally soulless it makes our craft and our company. You’re told that everyone’s doing it. That you’re either naive or a Luddite to fight against it. That you have to keep up or risk falling behind. That if you want more humanity, or critical thinking, or meaning in your work, maybe you just need to write better fucking prompts for the AI. shrug.

This is not what excellence looks like

We’ve been talking with more people lately who are really struggling to find meaning at work. Who feel like they can go through the motions but aren’t really sure what it’s leading to or why they should care. They do care, but they feel like they’re being punished for caring, and that their work would be easier if they didn’t. They see friends and colleagues who have already embraced not caring, but they themselves are still raging against the dying of the light.

It’s hard not to see these things as related. Because we already have a word for this. For what happens when deep, reflective, critical thinking and engagement is diminished. When everything turns into a shrug. And nothing really rises to the level of mattering very much. In other contexts, we call that thing depression.

You shouldn’t want a corporate culture that depresses people. The vitality of a group of talented people lit up about a thing, supporting each other, bringing their all to it because they’re counting on each other — that is a fucking incredible feeling. And you can tell us it’s sampling bias if you like, but we have had very few leaders come to us and tell us how much better their team is working, how much more vital and excellent they feel, after involving a GPT.

It’s not that it can’t happen. Lots of things can happen. It’s just that we’re hearing from more folks who sound depressed than lit the fuck up right now. And also, importantly, the people whose children play with rocks all sound a way at the moment.

We’re not saying hide from it. Despite the fact that it lies. Despite the fact that it cooks the planet, and is built on a pile of stolen creative work it could never equal. Despite the fact that its fans have been telling us that we’re 6 months away from having the problems solved, for the past several years. Hiding from it doesn’t help because there are too many people who have bet their companies and careers on its constant and relentless ascent. It’s here to stay, one way or another.

What we’re saying is to understand it. Understand that the people who tell you it will make beautiful art don’t know how art is made, and the people who tell you it will build thriving teams don’t know how thriving teams are built, either. That if you’re someone trying to figure out whether it’s a chump move to give a shit, to write your own emails, to sweat your own presentation, to grapple with the difficulties of doing your job well — that’s not an imperfection to be smoothed out, that’s the work. And we don’t need you to hate it.

— Melissa & Johnathan

Upcoming Programs

Actually good, actually useful training

Build Something Better

Subscribe to our free, biweekly newsletter
