Develop Yourself
To change careers and land your first job as a Software Engineer, you need more than just great software development skills - you need to develop yourself.
Welcome to the podcast that helps you develop your skills, your habits, your network and more, all in hopes of becoming a thriving Software Engineer.
5 Takeaways from Working at an AI Startup
We’re in a strange moment right now.
The AI bubble is showing signs of popping and the tech industry is waking up from a long night of partying.
We were told to “do more with less” and “just use AI” for the past 12 months.
After spending a year building production systems this way, I can tell you there’s a growing gap between what people think AI can do and what it actually does.
To say I’m annoyed with the media and talking heads online is an understatement.
It is really starting to boil my potato.
Here are my takeaways after grinding on a 5-person team to build a product using AI.
🚨 If you're interested in learning how to REALLY work with AI as a software developer - apply for our new program for working developers who want to learn AI: parsity.io/AIDev
Shameless Plugs
Free 5-day email course to go from HTML to AI
Got a question you want answered on the pod? Drop it here
Apply for 1 of 12 spots at Parsity - Learn to build complex software, work with LLMs and launch your career.
AI Bootcamp (NEW) - for software developers who want to be the expert on their team when it comes to integrating AI into web applications.
Welcome to the Develop Yourself Podcast, where we teach you everything you need to land your first job as a software developer by learning to develop yourself, your skills, your network, and more. I'm Brian, your host. I just spent six months at a very fast-paced AI startup, and I quit. Today is my last day, and I want to share some lessons that I've learned working in this startup, working with AI in general, and about the future of interviews, software development, and what I've seen trying to hire my replacement, including some of the worst, most rampant cheating I've ever seen. But before we get into that, I want to go through some of my major takeaways here, because I think what I'm going through is representative of what a lot of software developers are going through. And if you're very new to the field, I think this could be the way that you'll be working in the very near future, which is a little bit different, and in some ways a lot different, than how we've worked in the past.

We're in this really strange moment right now where everybody's basically being told to do more with less. We have layoffs, we have AI, and essentially CEOs are telling us to do more with the workers we have. And then you have people online telling you nonsense, in my opinion, about the tech job market: how it's impossible to get a job, or take whatever offer you can get, or AI has replaced the need for junior developers. I've talked extensively about why I think that's not true and also why the statistics and the numbers don't really prove out that story. Things aren't all roses and candy canes, obviously, but they're not as doom and gloom as many people would have you believe. Oftentimes the people with the loudest microphones have the worst stories, because a boring story like "hey, I'm getting paid kind of all right in a job I'm okay with" is a lot less interesting than "hey, I went from making $500,000 a year to working at McDonald's."

But anyway, when I joined this really small AI startup, it was really at the height of the AI madness and hype cycle. And there is a pervasive myth that I want to pop. I hope, if you're a CEO listening to this or a junior developer, that you really take heed of this: this whole "you can do more with less because of AI" thing is a ridiculous myth.

When I joined this most recent team, it was basically me and a data scientist. We were told to move really fast, use AI, and basically vibe code this whole thing from scratch. And it kind of worked, actually. To be completely honest, we had a prototype that looked pretty damn cool and it really did feel magical. It felt almost like we could build this over a few weeks if we really put our minds to it, and maybe the tools really are that good. I honestly started to feel a lot less secure in my role as a developer. I thought, man, maybe I was wrong. Maybe the tools really are so good that we are going to be replaced.

It took around six weeks or so, maybe a little bit less, for that honeymoon to end. When we went from prototype to production, meaning from something we had in a demo environment to something we actually wanted to show people or use, something that had real value and real functionality, it became immediately clear that this was not going to work. Let me explain some very specific scenarios here. We had a data pipeline, a pretty complicated, complex thing that cost money to run. It would ingest tons of data and it would use different APIs, which are not free and have rate limits.
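To make that concrete, here's roughly the kind of guardrail code that pipeline needed and that fast vibe-coded output tends to skip: a minimal sketch of calling a paid, rate-limited API with retries and exponential backoff. The endpoint, options, and limits here are made up for illustration; this is not our actual pipeline code.

```typescript
// Minimal sketch: calling a rate-limited, paid API with retries and backoff.
// The endpoint, response shape, and limits are hypothetical, not our real pipeline.

type FetchOptions = {
  maxRetries?: number;  // how many times to retry on 429/5xx
  baseDelayMs?: number; // starting delay for exponential backoff
};

async function fetchWithBackoff(
  url: string,
  { maxRetries = 5, baseDelayMs = 500 }: FetchOptions = {}
): Promise<unknown> {
  for (let attempt = 0; attempt <= maxRetries; attempt++) {
    const res = await fetch(url);

    if (res.ok) {
      return res.json();
    }

    // Retry only on rate limits (429) and transient server errors (5xx).
    const retryable = res.status === 429 || res.status >= 500;
    if (!retryable || attempt === maxRetries) {
      throw new Error(`Request failed with status ${res.status}: ${url}`);
    }

    // Respect Retry-After if the API sends it; otherwise back off exponentially.
    const retryAfter = Number(res.headers.get("retry-after"));
    const delayMs = retryAfter > 0 ? retryAfter * 1000 : baseDelayMs * 2 ** attempt;
    await new Promise((resolve) => setTimeout(resolve, delayMs));
  }

  // Unreachable, but keeps the return type honest.
  throw new Error("Exhausted retries");
}

// Hypothetical usage: pull one page of data from a paid, rate-limited source.
// const page = await fetchWithBackoff("https://api.example.com/v1/records?page=1");
```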
That pipeline was essentially the heart of our app. Without good data, you really don't have anything. Now, if you're producing the code that is ultimately responsible for gathering this data and you're trying to move at breakneck speed, you're ultimately just taking whatever the AI puts out for you. This is the definition of vibe coding. This is not vibe engineering or whatever other buzzword came out this week. The idea of vibe coding was essentially that you would prompt the AI, it would do the thing, you'd kind of make sure it kind of worked, and you'd move on.

The problem with this approach, though, is that you generate tons of code, and the faster you go, the more you rely on AI to keep improving what the AI has already put out. This results in barely working, pretty sloppy code, and overall, AI code tools seem basically incentivized to give you the most code possible to do something. If you're a software engineer, you probably already know this, but code is a liability. Ideally, you want less surface area for bugs rather than lots and lots of code. Because remember, other humans will ultimately work with your code. And because you're spending actual human dollars on this service that is deployed out there in the cloud somewhere, you can't just blame the AI when things go wrong. So at some point, we basically had to rip this apart and start from scratch. Now, the AI tools did give us a very good baseline to start from, but if we had just kept going down this path, we would have slowed down tremendously. I'll speak more about that slowdown in a minute.

Here's another big takeaway: CEOs actually should vibe code, but they should never, ever touch real code. At one point, we told our CEOs, hey, write some code with ChatGPT. Use Vercel's v0, which is essentially an AI code generator tool that will spit out a whole app from a prompt, right? So they'd paste in their questions, like, hey, how do I build a RAG system? And they'd get this textbook-style answer. Here's the big problem with some of the architectural decisions that a tool like ChatGPT can make: the answers it gives sound really good in system design interviews, but real software is not built like a system design interview, or like a textbook you've read in school, or like some dude on YouTube telling you how to build Shopify. In fact, Shopify, one of the largest, if not the largest, e-commerce platforms in the world, runs on a monolith, not microservices. If you don't know what those two things are, look them up. In fact, maybe ask ChatGPT what it would use if it had to construct a massive e-commerce store. Would it use microservices or a monolith, and why? You'll probably see the issue here. It'll probably tell you to use microservices when, in fact, large companies like Shopify oftentimes use monoliths. And then you have companies like PayPal or NASA using JavaScript or TypeScript on the back end. If you listen to anybody on LinkedIn or Twitter, they'll tell you that this is the stupidest thing in the world and that nobody uses JavaScript on the back end. None of this stuff fits "best practices," but some of the most popular and profitable companies in the world use these practices.

Now, I will say this: having the CEOs vibe code, or just prompt their way to a beautiful UI that didn't really work, was super cool, because then they could hand it off to us. We'd have some working code, we'd at least have the CSS files, and we could go from there.
But even at the most basic junior developer level, that's not something I would even give a junior developer as a task on their first day, right? This is simple stuff you might learn in a coding boot camp, or maybe in high school or college, or however you're learning how to code. This isn't really deep, valuable work, but at the same time it is really helpful to give your developer team a prototype and say, hey, here's a kind-of-not-working prototype, here's some working code, I just kind of want it to look like this. Here's what's in my mind, and now it's out here for you to look at. And maybe you can steal some of the code. You're like, cool, I can use 5% of this.

Now, back to that AI productivity myth. AI made us slower, not faster. This really confused me, and I kind of didn't want to believe it. I thought maybe I'm just jaded, or maybe I'm trying to find a reason not to use AI, or maybe I just need to be aware of my own biases and accept that AI is the best thing since sliced bread, and we will use it and we will love it. And I'm sure some of you are hearing this and thinking, oh, you just have a skills issue, or you should use this other model. I hate when people say that. I'm on a team of pretty smart people. These people are much more senior than me. Some of them have been writing code for decades; some of them are really, really bright people.

Here's the thing: we generated a lot of code with AI. And maybe this was our mistake. Because we were moving so fast, we had to generate a ton of code with AI. At some point, you can't fully understand what the code is doing, why it's there, why it's so verbose, why there are 10 READMEs here. It's just not meant for human consumption. Things were breaking constantly, and we were just moving too fast. The idea of just handing AI these prompts, or even really detailed specs, with rules documents and all the other things we tried to tame the AI, just didn't work. We were really trying to vibe code our way into a production app. What we saw eventually is that the number of errors and broken things coming back, even with test cases, even with all these other guardrails, even with specs, all of it was just ultimately slowing us down.

If I want to write a React component, for example, handwriting it might be slower than having an AI produce it, until you have to debug a thousand-line-long file full of premature optimizations and things that make zero sense at all. Researching a database and reading actual articles online or going to the docs takes longer than just asking ChatGPT, hey, what vector store and what cloud service provider should I be using? But it also means you're getting a lot more relevant information, information that a machine or an AI tool just won't know or, even worse, will just hallucinate. So, counterintuitively, going slower and handwriting some of your code ahead of time, or just using your brain to think, actually saves a lot of time in the long run. And there are some studies that prove this as well. There's a famous study out there by some really smart nerds showing that senior software developers using AI were about 20% slower with AI than without it, even though they thought they were actually 20% faster. Go figure.
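For context on that React example, here's the kind of small, handwritten component I'm talking about: a hypothetical sketch (the component and its props are invented for illustration), roughly 25 lines you can read and own end to end, versus the hundreds of lines of abstraction an AI tool will happily generate for the same job.

```tsx
// Hypothetical example: a small, handwritten React component.
// Nothing clever, no premature abstractions, easy to read and debug.
import { useState } from "react";

type SearchBoxProps = {
  placeholder?: string;
  onSearch: (query: string) => void; // the parent decides what "search" means
};

export function SearchBox({ placeholder = "Search...", onSearch }: SearchBoxProps) {
  const [query, setQuery] = useState("");

  return (
    <form
      onSubmit={(event) => {
        event.preventDefault(); // stay on the page, hand the query to the parent
        onSearch(query.trim());
      }}
    >
      <input
        value={query}
        placeholder={placeholder}
        onChange={(event) => setQuery(event.target.value)}
      />
      <button type="submit">Search</button>
    </form>
  );
}
```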
Now, here's the worst takeaway. I've been trying to find my replacement, and it's been pretty difficult to find people who have that unique mix of full-stack software skills, like using Next.js and React, but who also know a bit about the AI side: integrating large language models, retrieval augmented generation, how to use structured outputs, writing tests for things that use large language models, building agents. It's not that these skills are so difficult to learn. In fact, they're really not. It's just that most people don't have any practical experience doing them yet, because it's all so new. And I see a massive opportunity for you out there if you're wondering what to learn next. To me, it's simple. The thing you learn next, quite obviously, is how to integrate AI into web applications. Beyond that, if you want something very specific to learn, learn RAG. I've been saying this for almost a full year now. Companies are consolidating around this as the most practical use case for AI. If you're not learning RAG, if you're not learning how to integrate AI into web applications, and you're a software developer who mostly works on the front end, I think you're doing yourself a massive disservice.
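If RAG is new to you, the core loop is smaller than it sounds: embed your documents, find the chunks most similar to the user's question, and put those chunks into the prompt. Here's a rough sketch, with a made-up in-memory store and a placeholder embedding step standing in for whatever embedding model and vector database you would actually use.

```typescript
// Rough RAG sketch. The embed() call referenced at the bottom is a stand-in for
// whatever embedding model you use; the "store" is just an in-memory array
// instead of a real vector database.

type Chunk = { text: string; embedding: number[] };

function cosineSimilarity(a: number[], b: number[]): number {
  let dot = 0;
  let normA = 0;
  let normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Retrieval: rank stored chunks by similarity to the question's embedding.
function retrieve(questionEmbedding: number[], store: Chunk[], topK = 3): Chunk[] {
  return [...store]
    .sort(
      (a, b) =>
        cosineSimilarity(questionEmbedding, b.embedding) -
        cosineSimilarity(questionEmbedding, a.embedding)
    )
    .slice(0, topK);
}

// Augmentation: build a prompt that grounds the model in the retrieved chunks.
function buildPrompt(question: string, chunks: Chunk[]): string {
  const context = chunks.map((c, i) => `[${i + 1}] ${c.text}`).join("\n");
  return `Answer the question using only the context below.\n\nContext:\n${context}\n\nQuestion: ${question}`;
}

// Hypothetical usage, assuming embed() calls your embedding model of choice:
// const questionEmbedding = await embed("How do refunds work?");
// const prompt = buildPrompt("How do refunds work?", retrieve(questionEmbedding, store));
// ...then send `prompt` to your LLM for the generation step.
```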
Anyway, we've interviewed around six people at this point, and what I've seen has honestly shocked me. Now, we've met some really top-notch people, people where I'm like, whoa, you are smart and you're doing some cool stuff. Even they haven't had as much experience with some of these tools as we'd like, but we get it. They can learn, they're really smart, they're gonna figure it out. But at least two candidates so far have just completely cheated.

Here's the thing, though. We actually allow people to use AI during the coding portion of the interview. We say, just use whatever tool you want. You want to use Claude, you want to use Cursor, you want to use something we've never heard of? We don't care. All we want to see is you work through the problem and then tell us how you're thinking about it. People might think, oh, that's easy, I would pass that. No, you wouldn't. I've seen what most people do. They'll just get an answer, plop it in there, and when you ask them how it works, they have no freaking clue. Or you ask them something like, well, what could go wrong? What if I told you about this constraint? What if we made this ridiculously complex? What if I increase the input by an order of 10? What happens then? What's your opinion on adding tests? Would you add a test in this scenario? Why or why not? These are things an AI tool just can't spit out for you. This requires actual human thought, and this is often where people fail. Or they'll try to just keep reprompting until they get something that looks like what we want, rather than actually massaging the code into something that a human being or a software team would find acceptable.

But anyway, we let these people use AI and they still freaking cheated. Here's how we could spot it instantly, in case you're even considering doing this, though I really hope you're not. It was really strange, because in both interviews where people cheated, we noticed these telltale signs immediately. You'd ask a question and there would be this weird delay. The person would say some random thing like, oh, that's a great question, and then they'd get to answering it. After you do that 10 times, it becomes a little odd. They would kind of bumble with their words until they answered the question, and then it would be a textbook-perfect answer. At first, I actually was fooled and thought, oh, this person really knows what they're talking about, they could be a great fit.

Here's another telltale sign: the eyes. When your eyes are moving around on a screen, it looks like you're reading something, or maybe there's a glow on your glasses, or a reflection on your screen, and we can see that you're reading something off a screen. This isn't a deal breaker immediately. I get it, people have notes and things like that. Not a huge deal. But here's where things got really, really weird: when we'd ask them to explain the code. One person in particular didn't know anything. Like, hey, how does a back-end route work compared to HTML and CSS? Hey, what is const versus let in JavaScript? This guy was completely frozen. I was like, it's very scary that he got this far, and here he is in real time crashing and burning. He just hung up the call in the middle of it, because I was pressing him on why he didn't know super simple things when his resume and application said he was supposed to know this stuff. He said he'd been writing React and Next.js for years, and he didn't know the first thing about Next.js or React, or even JavaScript, or maybe even coding at all. The guy quit, which was pretty smart of him, because we were going to end the interview anyway and tell him to kick rocks.

All in all, people got through the first round of interviews, but no one made it past the second round due to cheating, because it was very obvious in that second round, within about 30 minutes max, that we would see you were cheating. As soon as we started the coding portion, within about five minutes, honestly, it's completely obvious. Here's one other major sign: their cursor would be doing all these strange things as you're talking to them. It's like, why is your cursor turning into a crosshair? Why is your cursor going over there when your code is over here?

Anyway, if you're tempted to use these tools, don't. One of the ones I saw costs $900. Not only are you going to get caught doing it, you're going to get blacklisted from those companies from ever coming back again. There is a document where we keep track of all this stuff. And imagine if we just wanted to name and shame them. Would that be so wrong? I honestly don't think so. It's actually pretty scary that some people are out there trying to fool others to get into jobs they have no business getting into. And I wonder, where do you go from there? What happens if we do hire you? How are you gonna be on a five-person team, faking it? And if you think, oh, I'll just use AI, well, that kind of tells me your seniority already. Because if you really think that AI could get you through a five-person team, or even a few months into a job, or even a few weeks, I think you have a lot to learn.

So, anyway, I know that our jobs are changing quite dramatically, but honestly, I think the biggest change out there is expectations. More than ever, developers have to know where the AI hype ends and where reality begins, and help the CEOs, managers, team leads, and non-technical people understand the reality of using these tools. They're not magic, and they're not a replacement for people yet.
They can be really, really good for things like prototyping, or even doing code reviews in some instances if you really need a helping hand. But these are not things you want to let run wild in your code base, because ultimately you're gonna build a large house of cards, and without a solid foundation, it will crumble and tumble over. You will not only be moving slower, you'll be moving more dangerously, because now you don't truly own the logic of your program. So if you're constantly hearing from leadership or people in your circle that you should just be moving faster because of AI: I know the CEOs want to believe this, and I'm sure there are some people who truly are seeing 10x gains. For the majority of us, though, that is a myth. It's a fantasy. The experiment has played out, and we've seen how it goes. It is not going to happen. Maybe next quarter, though, maybe Sam Altman or Mark Zuckerberg or somebody at Google will release the true replacement for developers that they've been talking about for the last year. They have another couple of months left in the year, so let's see where they go. But I'm a little bit doubtful at this point. I think we're going to get erotic chatbots before we get a replacement for software developers.

Anyway, hope that's helpful. That'll do it for today's episode of the Develop Yourself podcast. If you're curious about switching careers, becoming a software developer, and building complex software, and you want to work directly with me and my team, go to parsity.io. And if you want more information, feel free to schedule a chat by clicking the link in the show notes. See you next week.