diff --git a/transcripts/528-python-apps-with-llm-building-blocks.txt b/transcripts/528-python-apps-with-llm-building-blocks.txt new file mode 100644 index 0000000..9a8e9ab --- /dev/null +++ b/transcripts/528-python-apps-with-llm-building-blocks.txt @@ -0,0 +1,2816 @@ +00:00:00 In this episode, I'm talking with Vincent Warmerdam about treating LLMs as just another API in your + +00:00:05 Python app with clear boundaries, small focused endpoints, and good monitoring. We'll dig into + +00:00:11 patterns for wrapping these calls, caching and inspecting responses, and deciding where an LLM + +00:00:17 API actually earns its keep in your architecture. This is Talk Python To Me, episode 528, + +00:00:23 recorded October 23rd, 2025. + +00:00:44 Welcome to Talk Python To Me, the number one podcast for Python developers and data scientists. + +00:00:48 This is your host, Michael Kennedy. I'm a PSF fellow who's been coding for over 25 years. + +00:00:54 Let's connect on social media. + +00:00:55 You'll find me and Talk Python on Mastodon, Bluesky, and X. + +00:00:59 The social links are all in the show notes. + +00:01:02 You can find over 10 years of past episodes at talkpython.fm. + +00:01:06 And if you want to be part of the show, you can join our recording live streams. + +00:01:10 That's right. + +00:01:10 We live stream the raw, uncut version of each episode on YouTube. + +00:01:15 Just visit talkpython.fm/youtube to see the schedule of upcoming events. + +00:01:20 And be sure to subscribe and press the bell so you'll get notified anytime we're recording. + +00:01:24 This episode is brought to you by Sentry. + +00:01:27 Don't let those errors go unnoticed. + +00:01:28 Use Sentry like we do here at Talk Python. + +00:01:30 Sign up at talkpython.fm/sentry. + +00:01:34 And it's brought to you by NordStellar.
+ +00:01:36 NordStellar is a threat exposure management platform from the Nord security family, + +00:01:41 the folks behind NordVPN that combines dark web intelligence, + +00:01:45 session hijacking prevention, brand and domain abuse detection, + +00:01:50 and external attack surface management. + +00:01:52 Learn more and get started keeping your team safe at talkpython.fm/nordstellar. Hey, welcome to Talk Python To Me. Hi. Welcome back. Pleasure + +00:02:01 to be here. Yeah, it's been like, this is our second Talk Python interview, I think. We've + +00:02:06 interacted before also on socials and all that, but like, yeah, second time. Yay. Happy to be here. + +00:02:10 Yeah. Yeah. It's very exciting to have you back. I, you know, I'm trying to remember, I did a + +00:02:15 blog post on the most popular episodes + +00:02:20 of last year. + +00:02:22 I think, if I do recall correctly... + +00:02:25 Yeah, the number one last year was our + +00:02:28 interaction, I think, on spaCy. + +00:02:29 That's not terrible, is it? + +00:02:31 Yeah, it was a bragging right for a day. + +00:02:34 Yeah, it's pretty cool. So your episode and double bragging + +00:02:38 rights as your episode was released in, I don't know when that was, + +00:02:42 something like October or something, almost exactly a year ago to the day. And still it was the number one most downloaded for the whole year. So + +00:02:50 fantastic. And I think that might be an indication of where we're going to another very exciting + +00:02:56 topic, you know? Yeah. Like ML and AI only went up, right? So, so. + +00:03:03 You know what? I would love to get your thoughts on this. I, I'm seeing out on the internet and I + +00:03:08 want to put this up as kind of a, to counter this, this thought or whatever. But, but I'm seeing a + +00:03:13 lot of people get really frustrated that AI is a thing.
You know, OpenAI just released their Atlas + +00:03:21 browser, which I'm not super excited about, but it's like their version of what an AI browser + +00:03:26 wrapping Chromium looks like, you know what I mean? And it's fine. But the comments in Ars Technica, + +00:03:32 which are usually pretty balanced, are just like people losing it over the fact that it has AI. + +00:03:37 Are people tired of this stuff? Or why do you think that's the reaction, I guess, is really what I + +00:03:42 want to ask you. I mean, I guess I have like a couple of perspectives, but I think I could + +00:03:47 say we are all tired that every tech product out there is trying its best to find a + +00:03:52 way to put AI in the product. Like, oh, any Gmail, any calendar app, let's put AI in there, and it + +00:03:58 gets a bit tiring. I wouldn't be too surprised if fire hoses and refrigerators also come with + +00:04:02 AI features these days. It feels a bit nuts. I know, it's like, what is my refrigerator's AI for? + +00:04:09 Yeah, or like a soccer ball or whatever object. It feels like people are overdoing the AI thing. So okay, that can be a legitimate source of frustration. But then it also becomes a little bit personal, because I can also imagine if you put your heart and soul into being a good engineer, and you want to take the internet seriously, like you consider it somewhat sacred because it helped you so much in your career, then I definitely can also imagine, oh my God, please don't turn that into slop. Like, I built a career on that. So that might be the source of all these things. I guess my main way of dealing with it + +00:04:39 is more along the lines of, okay, this genie is not going to go back into the + +00:04:43 bottle, in a way.
But if we're able to do more things with these LLMs, if they're able to + +00:04:47 automate more boilerplate and that sort of thing, then it seems that then I can try and + +00:04:52 get better at natural intelligence as well, instead of this artificial intelligence. So + +00:04:56 like, okay, I can vibe code all these things now. So the bottleneck is just do I have good + +00:05:01 ideas? And maybe I should invest in good ideas. And oh, maybe I can actually learn about + +00:05:06 JavaScript, but then I should not focus on the syntax; I should focus maybe on + +00:05:09 the browser and how does CSS work. And oh, I should actually maybe do a little bit of stuff + +00:05:15 with flashcards just so I know the syntax just enough so I can review. Okay, I try to be very + +00:05:19 conscious about it that way. That's kind of my approach. And it's easy to get really lazy, + +00:05:24 right, and just push the button and say do the next thing, and not use it as an opportunity. Like, oh, it + +00:05:28 did something I didn't know; let me have a conversation and try to understand the thing I + +00:05:33 didn't know. I think the key word here is you want to be deliberate about it. I think if you can + +00:05:37 sort of say, okay, I'm deliberate about the way I use this stuff, and I'm also learning as I'm doing + +00:05:41 this. One thing I really like to do is I just give some sort of bonkers front-end task + +00:05:46 to the LLM that I have no idea how I would solve, and then study the output. That's the thing + +00:05:50 I actually do once in a while. But yeah, I mean, I do get it that people have mixed emotions + +00:05:56 about it, and that's totally cool and fine. It's just that for me, in my situation, this is how I + +00:06:01 deal with it. Yeah, I think that's fair. I think also I definitely have long since been tired of + +00:06:07 every email app having to rewrite this with AI and it usually destroys the email.
+ +00:06:12 You know, like removes all the formatting. + +00:06:14 And then you just, you know, there's always the cartoon of like, + +00:06:17 I wrote an email as bullet points. + +00:06:19 I asked AI to expand it. + +00:06:20 I sent it to you. + +00:06:21 You're like, what a long email. + +00:06:23 I'm gonna ask AI to summarize this. + +00:06:25 We should have just sent the bullet points. + +00:06:27 - Yeah, I mean, I do fear the amount of slop though. + +00:06:30 Like I do, it almost feels like every good tech YouTuber, + +00:06:33 there aren't a lot of good tech YouTubers anymore; they're all doing AI stuff instead of doing like actual really great programming. There's still a + +00:06:40 few. Anthony Writes Code is like a really good one still. Like he's awesome. Definitely follow + +00:06:45 him if you want to learn Python. He does cool stuff. But yeah, the main thing, I guess from my + +00:06:51 like selfish perspective, YouTube used to be better because nowadays it's all about the AI and you've + +00:06:55 always got this thumbnail of a guy pulling his hair out like, oh my God, this changes everything. + +00:06:59 Yeah. I would love to see a video that's maybe not that.
Yeah, well, we'll see exactly how far, I think. So actually, I mean, we can speculate. I can + +00:07:12 argue that it's been over-invested; I can also argue there's still so much potential. The one + +00:07:16 thing I will say, though: we have Claude kinds of tools, and we're all looking forward to + +00:07:21 the vibe coding, oh, it's never been easier, and that sort of thing. But I would have expected an explosion + +00:07:24 of awesome apps to be going along with it, and it doesn't necessarily feel like we have much better + +00:07:29 software, even though we supposedly have way better tools. So something about that is making me a little + +00:07:34 bit suspicious. But it might also just be that the apps that are being built, I'm not the + +00:07:38 target audience for, because I can imagine that. Because let's say you have something awesome for + +00:07:42 dentists, it can still be an awesome CRUD app that you could build with Claude, but I'm not a + +00:07:46 dentist. So, you know, wow, there's a lot of new cool dentistry management apps. You're right. + +00:07:53 If a dentist can outprogram, right? Like I do believe in the story that, oh, as a dentist, + +00:07:58 you might be best equipped to write an app for dentists, right? And if they're now more empowered + +00:08:02 to maybe do some stuff with code. + +00:08:04 I mean, there's a story there that every single niche profession + +00:08:07 will have someone who can do enough Claude to make the app for that profession. + +00:08:12 Yeah. + +00:08:13 Well, time will tell. + +00:08:13 It used to be that you had to have programming skill + +00:08:16 and a little bit of, let's say, dentistry experience to build the right sort of specialized vertical. + +00:08:21 And now I think it maybe is reversed. + +00:08:23 You need a lot of dentistry and a little bit of coding skill to go along with an AI. + +00:08:28 Yeah. + +00:08:29 The character sheet used to be 80 points here, 20 points there.
+ +00:08:32 and now it's 80 points there and 20 points here. + +00:08:34 Yes, exactly. + +00:08:35 That's exactly what I was thinking. + +00:08:36 Like, I think maybe that's actually shifted until you said that. + +00:08:39 I've never really thought about it, but it just may be. + +00:08:41 I do want to point out for people listening, we're not going to really talk too much + +00:08:47 about like using agentic tools to write code for us. + +00:08:51 Like we have been speculating about this theoretical dentist. + +00:08:55 But what instead we're going to do is we're going to talk to you, Vincent, + +00:08:59 about how can I use an LLM like a library or an API to add functionality, features, behaviors to an existing data science + +00:09:08 tool or to an existing web app or whatever it is we're building, right? + +00:09:12 Yes. + +00:09:13 So also people might probably know it, but I made a course for Talk Python, of course, + +00:09:18 and we're going to talk about some of those details. + +00:09:21 But the idea for that course was not necessarily like, how can I use LLMs to build me software? + +00:09:26 It's more, oh, how can I build software that also uses LLMs under the hood? + +00:09:29 Like if I have an app that makes a summary, how can I make sure the summary is reliable? + +00:09:33 And basically get the ball rolling on those building blocks. + +00:09:36 That's what that course is about. + +00:09:37 That's also probably going to be the main bones of this topic as opposed to a dentist + +00:09:41 Bob and his ambitions to make a new tech startup. + +00:09:44 Exactly. + +00:09:45 We're going to grow new teeth if we just could get the software right. + +00:09:47 Now, yeah. + +00:09:48 So how do you do that? + +00:09:49 And we're going to talk about that before. + +00:09:51 I just want to give you a chance before we really dive too far into the details of how + +00:09:56 we make that work is just, you know, give people a sense of what you've been up to lately. 
+ +00:10:00 You've done a lot of cool stuff with CalmCode.io. + +00:10:04 I can see your YouTube game is growing strong here. + +00:10:08 And you've been doing a lot of data science at Marimo. + +00:10:11 Yeah, what's... + +00:10:12 Right. + +00:10:12 Yeah, so we haven't spoken in a year, so maybe we should catch up. + +00:10:16 So long story short, CalmCode is still a project that I maintain. + +00:10:19 It's basically 99% free educational content on Python. + +00:10:24 That thing is just going to maintain itself. + +00:10:27 Super happy to keep maintaining it. + +00:10:29 The main thing I'm doing with that nowadays is just every two months I switch cloud providers + +00:10:32 just because I can, and then I can sort of see what it's like on the other side. + +00:10:36 So that's the thing that I do. + +00:10:37 From CalmCode, though, I figured I might also start a YouTube channel, + +00:10:40 and that ended up being a YouTube channel where I still talk about Python stuff, + +00:10:43 but I mainly talk about these fancy ergonomic keyboards + +00:10:46 because I had a couple of RSI issues. + +00:10:47 In a year's time, I went from 100 subscribers to 5,000 something, + +00:10:51 so this thing kind of took off. + +00:10:53 That's awesome. + +00:10:53 I actually have sponsors now. + +00:10:55 So I actually have a couple of companies in Asia sending me their custom boards for me to review. + +00:11:00 So that's been a really fun journey. + +00:11:02 But since last time we spoke, I also switched employers. + +00:11:04 So people might have heard of Jupyter before. + +00:11:07 That's an environment in Python where you can do interactive things. + +00:11:10 And I work now for a company that makes Marimo, which is an alternative, a very likable one. + +00:11:15 It does work completely differently. + +00:11:17 One of the main things that attracts people to it is the fact that these kinds of notebooks are Python files under the hood.
+ +00:11:23 They're also a little bit more interactive. + +00:11:25 You can do more rapid prototyping with UI. + +00:11:27 You can blend Python with it very nicely. + +00:11:30 So I will say like all of my rapid prototyping, especially with LLMs, I do that in Marimo + +00:11:34 nowadays just so you can blend Python with UIs very easily. + +00:11:38 And there's demos on the site that people can go ahead and check out. + +00:11:41 But that's also the second YouTube channel I made this year. + +00:11:45 I do a lot of stuff for Marimo. + +00:11:47 I'm a little bit more on the growth side of things than the hardcore engineering side of things. + +00:11:51 I still make a lot of plugins for Marimo, of course, but that's a little bit more of what I do nowadays. + +00:11:56 Making sure that people learn about Marimo. + +00:12:00 You can do things in a notebook now that you couldn't think of before. + +00:12:03 I write my command line apps in a notebook nowadays because it's actually not just because I can, but because it's convenient too. + +00:12:09 So we'll talk a little bit about that later when we talk about LLMs, but that's effectively the short story of what I've been doing last year. + +00:12:14 Nice. All right. I am a fan of Marimo. + +00:12:18 Happy to hear it. + +00:12:19 Yeah, yeah, yeah. I did a data science course this year, just enough Python and software engineering for data scientists. Like, you know, what could you bring from the software engineering side to like sort of make your, your data engineering, data science stuff a little more reliable. But I was right on the fence of should I, should I use this? Because it's, I think it's clearly better. But at the same time, I also want to use what people are using. So I didn't quite go for it. But I just, the UI is fantastic. 
+ +00:12:47 the reproducibility, reliability of it, where it uses the abstract syntax tree + +00:12:52 or concrete, whatever, to understand relationships across cells + +00:12:56 so they don't get out of sync, potentially. + +00:12:57 Yeah, I think it's really nice. + +00:12:59 I had Akshay on the show, who you work with also, + +00:13:02 to talk about it. + +00:13:03 The one thing in Jupyter that's, I mean, let me start by saying, + +00:13:08 Jupyter is still a really cool project. + +00:13:10 Like if I think back of the last decade, like all the good that that project + +00:13:13 has brought to my career, not just directly as a user, + +00:13:16 but also indirectly, all the algorithms that got invented + +00:13:19 simply because we had a good enough interactive environment suddenly, which we never + +00:13:23 had before, it's done wonders. + +00:13:25 So we should definitely be thankful. + +00:13:27 Yeah, I'm not bashing on it either. + +00:13:29 Yeah, but I do always want to make sure that I give credit where credit is due, because + +00:13:33 the project had a lot of impact. + +00:13:35 But there is this really annoying thing with Jupyter, though. + +00:13:37 Besides, you can be a bit grumpy about the file format, sure, but the one thing that's + +00:13:41 very annoying is you can write down X is equal to 10, and then + +00:13:45 have all sorts of cells below it that depend on X, You could then delete the cell and the notebook will not complain to you about it. + +00:13:51 Even though if anyone else tries to rerun the notebook, X is gone, it won't run, and your notebook is broken and you can't share the thing anymore. + +00:13:58 And nowadays, fast forward like many years later, and we've got stuff like uv now. + +00:14:02 So we can add metadata to a Python file to add dependencies. + +00:14:06 And oh, wait, because Marimo is a Python file, we can add dependencies to the notebook that makes it super reproducible. 
+ +00:14:11 There's just all this stuff that you can rethink now simply because we have a good notebook format that is still a Python file. + +00:14:18 That's really the killer feature here. + +00:14:20 We can talk more about it if you like, but I can talk for ages about this topic, just warning you. + +00:14:25 One final thing, maybe also for the software engineering side of things, because you didn't mention it. + +00:14:29 Just to give an example of something we added recently. + +00:14:31 If you have a cell in Marimo and there's a function in there that starts with test underscore, we will automatically assume it's a pytest. + +00:14:39 So you can actually add unit tests to your notebook as well. + +00:14:41 And then if you say python notebook.py, then pytest will just run all the tests for you, + +00:14:47 even though you can also run the tests in line in your CI CD as well. + +00:14:49 So there's all sorts of these Python specific things that we can add, + +00:14:53 again, because it's just a Python file. + +00:14:54 Yeah, it's sort of the second mover advantage or nth mover advantage + +00:14:58 where n is greater than one, where you see, okay, that was awesome. + +00:15:01 Maybe this was a little bit of a rough edge. + +00:15:02 And what would we do to work around that or smooth it out, right? + +00:15:06 Well, and also we're also lucky that we're born in the age of uv. + +00:15:09 I got to say like a lot of like quality of life improvements to a notebook. + +00:15:15 A lot of that is also due to the fact that uv is around. + +00:15:17 That's definitely a huge help as well. + +00:15:20 This portion of Talk Python To Me is brought to you by Sentry's Seer. + +00:15:24 I'm excited to share a new tool from Sentry, Seer. + +00:15:27 Seer is your AI driven pair programmer that finds, diagnoses and fixes code issues in your Python app faster than ever. + +00:15:35 If you're already using Sentry, you are already using Sentry, right? 
+ +00:15:39 Then using Seer is as simple as enabling a feature on your already existing project. + +00:15:45 Seer taps into all the rich context Sentry has about an error. + +00:15:48 Stack traces, logs, commit history, performance data, essentially everything. + +00:15:53 Then it employs its agentic AI code capabilities to figure out what is wrong. + +00:15:58 It's like having a senior developer pair programming with you on bug fixes. + +00:16:02 Seer then proposes a solution, generating a patch for your code and even opening a GitHub pull request. + +00:16:08 This leaves the developers in charge because it's up to them to actually approve the PR. + +00:16:13 But it can reduce the time from error detection to fix dramatically. + +00:16:18 Developers who've tried it found it can fix errors in one shot that would have taken them hours to debug. + +00:16:24 Seer boasts a 94.5% accuracy in identifying root causes. + +00:16:29 Seer also prioritizes actionable issues with an actionability score, so you know what to fix first. + +00:16:36 This transforms Sentry errors into actionable fixes, turning a pile of error reports into an ordered to-do list. + +00:16:44 If you could use an always-on-call AI agent to help track down errors and propose fixes before you even have time to read the notification, check out Sentry's Seer. + +00:16:54 Just visit talkpython.fm/SEER, S-E-E-R. + +00:16:58 The link is in your podcast player show notes. + +00:17:01 Be sure to use our code, Talk Python. + +00:17:03 One word, all caps. + +00:17:05 Thank you, Sentry, for supporting Talk Python To Me. + +00:17:08 I saw on, gosh, where was it? + +00:17:10 Speaking of Reddit, I saw on slash R, learn Python or slash R Python. + +00:17:16 One of the slash R's with a Python substring. + +00:17:19 Someone asks, what Python package manager would you use now? + +00:17:23 And it's just like, how many times can you say UV? + +00:17:27 The feedback comments.
+ +00:17:29 I mean, it was someone new who was unsure, right? + +00:17:31 They've seen all the different ones. + +00:17:33 And yeah. + +00:17:34 I mean, the coolest comparison I've seen, I think it was a tweet someone posted. + +00:17:38 But like, definitely, like, suppose you're in the data field right now. + +00:17:41 Like, what are the tools 10 years ago? + +00:17:42 What are the tools now? + +00:17:43 It definitely does feel like, okay, before we had pip, now we have uv. + +00:17:47 Before we had Pandas, now we have Polars. + +00:17:50 Before we had Matplotlib, now we have Altair. + +00:17:52 And before we had Jupyter, now we've got Marimo. + +00:17:53 You can kind of see a generational shift, not just on the notebook side of things, + +00:17:57 but like on the package manager, on the data frame library, + +00:18:00 we're all learning from the previous generation, it kind of feels like. + +00:18:03 Yeah, absolutely. + +00:18:04 It's an amazing time. + +00:18:06 Every single day in Python is just more exciting than the day before. + +00:18:09 Yeah. + +00:18:10 Although I should also mention, especially with the Polars + +00:18:12 thing, there's a fair bit of Rust too. + +00:18:14 That also makes a difference. + +00:18:16 Yeah, yeah, of course. + +00:18:17 So before Flyby, I mean, you talked a bit about your YouTube + +00:18:22 channel, you review the ergonomic keyboards. + +00:18:25 For those people who are just listening, not watching: your background is super nice. + +00:18:30 Your bona fides are solid here, but you've got a bunch of different ones. And I personally also, + +00:18:35 I don't know, you might even just kick me out of the club if I should do that. + +00:18:38 That's the Microsoft Sculpt, right? + +00:18:40 Microsoft Sculpt ergonomic. And I like it so much because I can take this little bottom thing off + +00:18:44 and it's like razor thin; I can rotate and reverse and jam it in my backpack and take it + +00:18:49 with me.
But I've also had RSI issues for 25 years or something. And it was really bad. And I switched + +00:18:55 I used to type them on like laptops and stuff. + +00:18:56 And I switched to just something like this and I could type 10 hours a day and + +00:19:01 there's no problem. Whereas before, if I, if I were to force to type full on, on a laptop, + +00:19:05 I would probably be like unable to work in a week. If not a week. + +00:19:09 So we had an offsite in San Francisco. So I was there like two weeks ago, + +00:19:13 a lot of fun, but like, I'm not going to bring like, + +00:19:16 airport security is going to look at this and wonder what, + +00:19:19 like what kind of weird alien device this is. Right. So I figured, okay, + +00:19:22 I'll leave those at home and just bring my normal laptop. And after a week, + +00:19:25 I'm feeling it in my wrist again. + +00:19:27 It's not good. + +00:19:27 Yeah, it's something that takes serious. + +00:19:29 Although I will also say like because the kid was sick and the wife was sick + +00:19:34 so I wasn't able to do a whole lot of exercise at home. + +00:19:37 Also do exercise. + +00:19:37 That also totally helps. + +00:19:39 If I had to like, sure, buy an ergonomic keyboard if you like. + +00:19:42 Programmatic keyboards, they're great. + +00:19:44 But if you're going to do that as an excuse not to do exercise, you're doing it wrong. + +00:19:47 That's the one thing I do want to add to that. + +00:19:48 There's lots of stretches you can do. + +00:19:50 Taking breaks matters. + +00:19:51 Exercise matters. + +00:19:53 But I think those keyboards are an essential feature. + +00:19:56 They do totally help. + +00:19:58 And also, sometimes it's not so much a healing measure. + +00:20:01 It's more of a preventative measure. + +00:20:02 Yes, 100%. + +00:20:04 And I would like to put this message out just to everyone listening who is young, absolutely indestructible. + +00:20:11 Please. 
+ +00:20:12 I know they're a bit of a pain in the butt with the weird curviness. + +00:20:15 It is so worth it to just say, I've never had any RSI issues. + +00:20:21 I just use this weird keyboard and people make fun of me, + +00:20:23 but I don't care. + +00:20:23 You know, that is a better thing than like, I'm working on it. + +00:20:27 Yeah. + +00:20:27 Oh, my. + +00:20:28 It hurts. + +00:20:28 You know? + +00:20:29 Well, another thing is also, especially now we've got stuff like Claude, right? + +00:20:32 I just want-- if I can point out like one thing on that front. + +00:20:36 So these ergonomic keyboards, you can often program them as you see fit. + +00:20:39 So you can say, like, if I hit this key, it's K. + +00:20:42 But if I hold it down, it becomes command, or whatever you like. + +00:20:45 But it also means that you can map macros or shortcuts. + +00:20:47 So whenever I hit this button, an app called MacWhisper boots up. + +00:20:51 That's going to record my voice. + +00:20:51 I love MacWhisper. + +00:20:53 Yeah. + +00:20:53 And then there's alternative variants for it. + +00:20:55 I'm sure you've got stuff for Linux as well. + +00:20:56 But the main thing it does is just really good speech to text. + +00:20:59 And then whenever I'm done talking, it's just immediately going to paste it wherever I am. + +00:21:04 So, oh, that's actually very nice, because now the keyboard shortcut is just me holding my thumb down + +00:21:08 instead of some sort of weird claw I got to do on my keyboard. + +00:21:15 And suddenly, it doesn't necessarily become a convenience thing. + +00:21:17 It also becomes kind of a power user thing. + +00:21:20 Like you can really customize your computer experience + +00:21:22 if you can really map everything to your keyboard just the way that you like. + +00:21:25 It's like having Vim, but for all the apps out there. + +00:21:28 Okay, that's very neat. + +00:21:29 Yeah, I've remapped Caps Lock to activate MacWhisper.
+ +00:21:34 So if I hit Caps Lock and hold this down, then I can dictate. + +00:21:36 And I know computers have had dictation for a long time, + +00:21:38 but it's been really bad, right? + +00:21:40 The built-in dictation to macOS or Windows isn't great, + +00:21:44 especially when you try to talk acronyms like, hey, do PyPI, and it's like... + +00:21:47 Yeah, Mac Whisper. + +00:21:48 No, but Mac Whisper, it's not like all the way there, + +00:21:51 but I will say it surprised me in quality a few times, definitely. + +00:21:55 Yeah, yeah. + +00:21:56 And that app, you can go in and actually set replacements. + +00:21:58 Like if you ever think you're going to write this, write that instead. + +00:22:02 So I've done that with like Talk Python because it always just turns it into a camel case combined. + +00:22:08 I'm like, no, those are two separate words. + +00:22:10 You know, whatever. + +00:22:10 You can sort of customize it a bit. + +00:22:12 Yeah, and I think my favorite one, I think Mac Whisper actually gets this one wrong still, + +00:22:17 But whenever I write scikit-learn, which is a very popular data science package, it always + +00:22:22 translates it to psychologists learn. + +00:22:26 Which, you know. + +00:22:30 I never want you to ever write this series of words. + +00:22:33 Because if I have to, I'll just type it all in time. + +00:22:36 It's not my common thing. + +00:22:39 It's one of these moments where actually I type scikit-learn often enough where it's + +00:22:43 like almost becoming the issue. + +00:22:44 So I'm like at the verge of adding these rules manually + +00:22:47 for all these weird open source packages that I interact with. + +00:22:49 But yeah, it's take ergonomics serious people. + +00:22:52 That's the one thing I wanna say. + +00:22:53 You don't always have to buy a super expensive keyboard. + +00:22:56 The, if you wanna explore like programmatic keyboards + +00:22:59 because you can customize things, that's an excellent reason. 
+ +00:23:01 But like take a break and do exercises and just, you know, be healthy. + +00:23:05 That's the, that's the, that's how you win a marathon. + +00:23:07 - Yes, that's for sure. + +00:23:08 Now this is not the Vincent's keyboard review marathon, + +00:23:12 but let's wrap it up with, if you could take any keyboard + +00:23:17 hanging around on your wall there, which one would you use? + +00:23:19 - Well, so there's four pairs of these, so it's probably this one. + +00:23:25 So there's a couple of other boards that are great too. + +00:23:27 This board is not the board I would recommend to everyone, + +00:23:29 but if you have serious RSI issues, I do think the Glove80 is your best bet. + +00:23:34 It's simple, in terms of the shape, it is probably the most ergonomic shape for most hand sizes. + +00:23:39 - Yeah, okay, awesome. + +00:23:40 All right, well, let's switch over to talking about programming. + +00:23:45 Yes. + +00:23:45 Programming with LLMs, not programming LLMs. + +00:23:48 And I guess, like you already called out, you wrote this course called + +00:23:52 LLM Building Blocks for Python. + +00:23:54 Super fun course, really neat, it's pretty short and concise. + +00:23:57 And it really talks about how can you reliably add some kind of LLM into your code. + +00:24:04 I guess what you're talking about in this course really applies regardless of + +00:24:08 whether it's a self-hosted one or it's OpenAI or Anthropic, right? + +00:24:12 You can have, there's some, some choices you can make on which LLM to use, right? + +00:24:17 So the, like the main idea I had with the course was like, + +00:24:22 an LLM is a building block at some point, but it's very unlike a normal building block when you're dealing with code, + +00:24:27 because normally with code, you put something into a function and like one thing comes out. + +00:24:32 But in this particular case, you put a thing into a function, + +00:24:34 you have no idea upfront what's going to come out.
+
+00:24:36 And not only that, but you put the same thing in twice
+
+00:24:38 and something else might come out as well.
+
+00:24:40 So that means that you're going to want to think about this tool
+
+00:24:44 a bit more defensively.
+
+00:24:45 It's a weird building block.
+
+00:24:46 It's like you have to put a moat around it, because otherwise the building block is going to do stuff
+
+00:24:49 you don't want it to do, almost.
+
+00:24:52 And some of that is syntax.
+
+00:24:53 Some of that is how do you think about these Lego bricks.
+
+00:24:56 And some of it is also just what is good methodology in general
+
+00:24:59 to statistically test if the thing is doing roughly what you want it to do.
+
+00:25:03 Nice.
+
+00:25:04 That's the gist of it.
+
+00:25:06 Yeah, cool.
+
+00:25:07 Yeah, I pulled out a couple of ideas, concepts, and tools
+
+00:25:10 that you talked about throughout the course.
+
+00:25:12 And you don't have to have taken the course to have these things be useful.
+
+00:25:15 I just thought it might be fun to riff on some of the things
+
+00:25:18 you touched on here.
+
+00:25:20 The main thing I think will be fun is--
+
+00:25:22 it's been half a year, I think.
+
+00:25:23 It will be fun to see how much of it is still intact.
+
+00:25:25 I think most of it still definitely is.
+
+00:25:27 But it might be fun to see if we can find anything that might be dated, just to see if the world has moved on
+
+00:25:33 quickly.
+
+00:25:33 I think there's only one thing, but I'm just kind of curious.
+
+00:25:36 Let's see.
+
+00:25:37 Yeah, all right.
+
+00:25:37 Well, let's keep our radar up for that.
+
+00:25:40 It's definitely something that's changing more quickly
+
+00:25:43 and has a higher likelihood of being dated.
+
+00:25:46 But I think it holds up pretty well.
+
+00:25:47 Yeah, okay.
+
+00:25:49 One of the things I remember emphasizing is you want to do some stuff like caching.
+
+00:25:53 So let's say you've got a function and you use an LLM for it.
+
+00:25:56 And let's keep it simple.
+
+00:25:58 Let's say we're just making summaries.
+
+00:25:59 So a Talk Python episode paragraph goes in, a single sentence is supposed to come out.
+
+00:26:04 Something like that.
+
+00:26:05 Okay, well, you might have a loop and you're going to do maybe one pass, try one LLM with
+
+00:26:13 one type of setting, try another LLM with different types of settings to generate all
+
+00:26:16 this data.
+
+00:26:17 It would be a shame that you're going to use an LLM, which is kind of an expensive compute
+
+00:26:22 thing, if you put the same input in by accident and then you incur the cost twice.
+
+00:26:26 That would really stink.
+
+00:26:27 So one of the things you always want to do is think a little bit about caching.
+
+00:26:31 And there's a Python library called DiskCache that I've always loved to use.
+
+00:26:34 And I highly recommend people have a look at it.
+
+00:26:36 I think Michael, you've also used it in one of your courses before.
+
+00:26:39 The thing is, we have to talk about this.
+
+00:26:40 It's so good.
+
+00:26:41 It is so good.
+
+00:26:42 It is SQLite and is so good.
+
+00:26:45 It is even better than SQLite.
+
+00:26:47 It is unbelievably good.
+
+00:26:48 And I have you to thank that I knew about it, but yeah, whatever.
+
+00:26:52 And then after I saw you use it, I'm like, genius.
+
+00:26:54 It is genius.
+
+00:26:56 Yes.
+
+00:26:56 No, so it's like having the LRU cache.
+
+00:27:00 But it's also on disk.
+
+00:27:01 So if you were to restart Python, you still have everything in cache basically.
+
+00:27:05 And it's a SQLite database, so you can always inspect all the stuff that's in there.
+
+00:27:10 If you wanted to, you can also do fancy things like add a time to live to every single object.
+
+00:27:16 And this is something you could do in a Docker container for a web app.
+
+00:27:19 But the main thing that's always nice when you're dealing with LLMs is you always want
+
+00:27:23 to be able to say in hindsight, like, okay, how did this LLM compare to that one?
+
+00:27:27 You want to compare outputs.
+
+00:27:28 Then just writing a little bit of a decorator on top of a function
+
+00:27:32 is the way to put it in SQLite, and you're just done with that concern.
+
+00:27:35 That is just amazing.
+
+00:27:37 And we're using this cache directly.
+
+00:27:40 If you're using LLM by Simon Willison from the command line,
+
+00:27:43 there's also a mechanism there so that you can get it into SQLite
+
+00:27:46 if you wanted to.
+
+00:27:46 So that's also a feature you could consider.
+
+00:27:49 But--
+
+00:27:49 Of course, he's going to put something in SQLite.
+
+00:27:51 Like, he can't write a library that doesn't put something in SQLite,
+
+00:27:53 given his Datasette project.
+
+00:27:55 It's Simon Willison.
+
+00:27:56 He'll put SQLite in SQLite if it's a Sunday.
+
+00:27:58 Yeah.
+
+00:28:01 But if you haven't used DiskCache before, it definitely feels like one of these libraries
+
+00:28:06 that because I have it in my back pocket, it just feels like I can tackle more problems.
+
+00:28:10 That's the short story of it.
+
+00:28:11 And again, the course uses it in a very sort of basic fashion,
+
+00:28:15 but knowing that everything you do with an LLM only needs to happen once,
+
+00:28:20 if you're interested in using it once, that just saves you so much money.
+
+00:28:24 Yeah, and it's so useful.
+
+00:28:25 So one of the things you did in the course is you said,
+
+00:28:27 all right, the key for the value that we're going to store is going to be the model, the model settings, and the prompt as a tuple, right?
+
+00:28:35 Something along those lines.
+
+00:28:36 And then you use that as the key.
+
+00:28:38 So if any of those variations change, does the model change?
+
+00:28:41 Do the settings change?
+
+00:28:42 Or like anything, that's a totally different request.
+
+00:28:45 And then you just store the response.
+
+00:28:46 And then boom, if you ask the exact same question of the same model with the same prompt with the same settings, why do you need to go and wait 10 seconds and burn money and environmental badness
+
+00:28:56 when you could literally within a microsecond get the answer back?
+
+00:29:00 Yeah, and there's also a fancy thing.
+
+00:29:01 It's a trick you can do on top.
+
+00:29:02 So sometimes you want to say, well, there's a statistical thing also happening.
+
+00:29:06 So sometimes I want to have one input and actually store maybe 10 outputs
+
+00:29:10 so I can look at all these outputs, maybe annotate that later.
+
+00:29:13 And the way you solve that is you just add an integer to the tuple, basically.
+
+00:29:17 And then you're also able to store many outputs if you really like.
+
+00:29:20 The sky's the limit.
+
+00:29:21 The fourth one, the eighth one, whatever.
+
+00:29:23 Yeah, that's flexible.
+
+00:29:25 Yeah.
+
+00:29:26 And it does a whole bunch of neat things that are really, really wild.
+
+00:29:29 Like, so it looks like just, hey, I put this value into it, right?
+
+00:29:32 And it stores it.
+
+00:29:33 It's really powerful because it works across application executions.
+
+00:29:37 So like maybe if you're caching that response in your notebook while
+
+00:29:40 you're doing some testing, if you come back later, you start the notebook back up
+
+00:29:44 or restart the kernel or whatever, it's not like an LRU cache.
+
+00:29:47 It remembers because it's stored somewhere in like temporary storage in a local
+
+00:29:51 SQLite file, which is amazing.
+
+00:29:53 It also has interesting ideas.
+
+00:29:56 I'm not sure really where they are, but it has different kinds of caches as well.
+
+00:30:00 So maybe you're storing a ton of stuff in there.
+
+00:30:03 And so it'll do basically built-in sharding.
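The key scheme described here can be sketched with a plain dictionary; a DiskCache object would be a drop-in replacement that persists to disk. `call_llm` and `cached_llm` are hypothetical names, standing in for a real API call and your wrapper around it:

```python
# Sketch of the cache-key idea: key on (model, settings, prompt, sample index).
# call_llm is a hypothetical stand-in for a real LLM API call.
calls = []  # track how often the "expensive" call actually runs

def call_llm(model: str, prompt: str, temperature: float) -> str:
    calls.append(prompt)
    return f"summary from {model} (temperature={temperature})"

cache = {}  # swap in diskcache.Cache("...") for on-disk persistence

def cached_llm(model, prompt, temperature=0.0, sample=0):
    # The settings are part of the key: a different temperature is a different request.
    key = (model, ("temperature", temperature), prompt, sample)
    if key not in cache:
        cache[key] = call_llm(model, prompt, temperature)
    return cache[key]

first = cached_llm("gpt-4o-mini", "Summarize: ...")
again = cached_llm("gpt-4o-mini", "Summarize: ...")            # cache hit, no new call
other = cached_llm("gpt-4o-mini", "Summarize: ...", sample=1)  # stored separately
```

Changing the model, a setting, or the prompt produces a new key, and the trailing integer is exactly the trick mentioned above for keeping several sampled outputs for the same input.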
+
+00:30:06 Oh, yeah.
+
+00:30:06 Multiple SQLite files.
+
+00:30:08 And it's really, really good.
+
+00:30:10 This is a deep library.
+
+00:30:11 This is not just, oh, yeah.
+
+00:30:13 It's like LRU, but to disk.
+
+00:30:15 So the cool thing about that library is it really does go deep.
+
+00:30:18 But if you really just want to use it as a dictionary, you can.
+
+00:30:22 That's the thing that I really love about it.
+
+00:30:23 But for all intents and purposes, you just treat it as a dictionary, and you're good.
+
+00:30:26 Or use it as a decorator on a function, and again, you're good.
+
+00:30:31 So again, it's one of those little hacks where, oh, if you just know about the library,
+
+00:30:36 you just become more productive at typical Python stuff.
+
+00:30:39 Yeah.
+
+00:30:39 I'll give you a place where I'm using it, actually.
+
+00:30:41 I use it on some LLM stuff, like where I'm programming against an LLM, for exactly the
+
+00:30:46 reason you did.
+
+00:30:46 Because if you have the same inputs, don't ask the question again.
+
+00:30:49 You just reuse the answer.
+
+00:30:50 It's real, real fast.
+
+00:30:51 But if you go over to like Talk Python, let's see here.
+
+00:30:55 Go over to the guests, for example, right?
+
+00:30:56 So there's a bunch of guests.
+
+00:30:58 And here we have Vincent.
+
+00:30:59 Not that Vincent.
+
+00:31:00 Not that Vincent.
+
+00:31:01 There you are, that Vincent.
+
+00:31:01 All these Vincents.
+
+00:31:04 They're all over the...
+
+00:31:05 I have like 560 guests or something.
+
+00:31:07 There's a lot.
+
+00:31:08 But in here, you'll notice that if you go into the view of the source on this thing,
+
+00:31:13 like all over the place, anytime there's a picture, it'll have this little cache-busting ID on the end.
+
+00:31:20 And that's fine when it's served locally because you can just look at the file
+
+00:31:23 and confirm it still has the same cache ID at startup.
+ +00:31:25 But if it comes from like an S3 blob storage, you know, and the app restarts, + +00:31:30 how do I know what that is? + +00:31:31 Like it has to go, it would have to re-download the entire content of the blob. + +00:31:37 So check this out. + +00:31:39 Yeah. + +00:31:39 So I just added. + +00:31:40 Yeah, yeah, yeah. + +00:31:41 So this feels like there's something between the proper CDN + +00:31:44 and then getting it from S3. + +00:31:45 Like there are these moments when you want to have something that's kind of in between. + +00:31:48 And then disk cache could actually be a good solution for it. + +00:31:51 Exactly. + +00:31:52 So what the site does is it has a disk cache. + +00:31:54 And anytime it says, hey, I want to refer to a resource that's external, + +00:31:59 it'll download it once, compute the hash, and then store it in the disk cache unless you change something behind it. + +00:32:05 And so it's automatically using this, and it makes everything like, + +00:32:09 there's never stale resources, and it's instantly fast, + +00:32:12 even if they're served out of something remote like S3 equivalent. + +00:32:15 And do you also do a time to live? + +00:32:19 Is it also the time to live is also like every day it's allowed to refresh once or something like that, I suppose? + +00:32:24 Yeah, for the S3 stuff, I don't because I've set up all the admin functions that if I ever change one through the admin, it deletes it out of the cache. + +00:32:32 Gotcha. + +00:32:33 So it's like it's internally consistent. + +00:32:35 But for like other things, like if it parses something out of the, say, the description, which is set in the dictionary, that stuff, it's just got a time to live of like a day or something. + +00:32:46 And it's got like, there's a bunch of those. + +00:32:47 I'm using all these different places. + +00:32:49 And wow, it's so good. 
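The cache-busting setup described above can be sketched like this; `fetch_bytes` is a hypothetical stand-in for the blob-storage download, and the dict could be a DiskCache so fingerprints survive app restarts:

```python
import hashlib

downloads = []  # count how often we actually hit remote storage

def fetch_bytes(url):
    # Hypothetical stand-in for downloading the resource from S3-style storage.
    downloads.append(url)
    return b"image bytes for " + url.encode()

fingerprints = {}  # swap in diskcache.Cache("...") to persist across restarts

def cache_busting_url(url):
    # Download and hash once; afterwards the fingerprint comes from the cache.
    if url not in fingerprints:
        fingerprints[url] = hashlib.sha256(fetch_bytes(url)).hexdigest()[:12]
    return f"{url}?v={fingerprints[url]}"

one = cache_busting_url("https://example.com/guests/vincent.jpg")
two = cache_busting_url("https://example.com/guests/vincent.jpg")  # no re-download
```

Deleting the cached entry when a resource changes (as described with the admin functions) is what keeps the fingerprint from ever going stale.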
+
+00:32:50 I just wanted to say thank you, because I knew we were going to talk about it today,
+
+00:32:53 because this is so good.
+
+00:32:54 Yeah, I know.
+
+00:32:57 If it was part of the standard library, I would honestly not be surprised.
+
+00:33:01 That's also the story with that.
+
+00:33:03 But yeah, SQLite is great.
+
+00:33:04 DiskCache is great.
+
+00:33:06 It feels like a dictionary, but it gives you so much more.
+
+00:33:08 It's great.
+
+00:33:09 That's the only thing I can say about it.
+
+00:33:10 So people are probably like, wait, I thought we were talking about LLMs.
+
+00:33:14 Yeah, we are.
+
+00:33:14 I think this is one of the interesting things. It's because there's all these interesting tools, and it's not even about using agentic AI.
+
+00:33:20 And it's like, there's really cool libraries and tools that you can just apply to this LLM problem, but also apply everywhere else.
+
+00:33:28 Right.
+
+00:33:29 It's one of those.
+
+00:33:30 So it's one thing I have found with LLMs in general if you're building software with them.
+
+00:33:34 On the one end, I think it can be very helpful if you're like a proper senior engineer kind of a person, because then you know about things like, oh, I want a cache.
+
+00:33:40 And what's the pragmatic cache?
+
+00:33:41 And you can also pick Redis, by the way.
+
+00:33:43 If you would have used Redis for this, that could have also worked.
+
+00:33:45 Just that I think--
+
+00:33:45 - Sure, it would have been fine.
+
+00:33:46 - Yeah, it would have been fine.
+
+00:33:47 - But you know what I like is there's no servers.
+
+00:33:49 I don't have to deal with the servers.
+
+00:33:51 Like, it's just, it's a file.
+
+00:33:53 - It's very true.
+
+00:33:54 - And it just keeps it simpler.
+
+00:33:55 - It's totally true.
+
+00:33:56 But the point I wanted to get at here, like your previous experience as a proper engineer
+
+00:34:01 will still help you write good LLM software.
+
+00:34:03 However, from a similar perspective, I also think that we do have this generation
+
+00:34:07 of like data scientists, and maybe data engineer, data analyst kind of person,
+
+00:34:11 like thinking analytically, being quite critical of the output of an algorithm.
+
+00:34:15 That's also like a good bone to have in this day and age, I would say.
+
+00:34:20 Because I've built a couple of recommenders back in my day, like it's been a decade now,
+
+00:34:24 but like one of the things you do learn when you're building a recommender is that you're stuck with this problem of,
+
+00:34:29 hey, I gave this user this recommendation and they clicked it.
+
+00:34:32 Would they have clicked this other thing if I would have recommended it to them?
+
+00:34:36 And suddenly you're dealing with a system that's like really stochastic and hard to predict.
+
+00:34:40 And you have to be kind of strict about the way you test and compare these different algorithms.
+
+00:34:44 And you want to think twice about the way you A-B test these things.
+
+00:34:47 And, oh, actually, just like DiskCache is useful as a tool,
+
+00:34:50 having a little bit of methodology statistically in your mind will also help you,
+
+00:34:55 because comparing LLMs, a lot of it is doing evaluations, being kind of strict about that.
+
+00:35:01 And that's also what I try to do in the course.
+
+00:35:03 I try to just show you that if you're really strict about evaluations,
+
+00:35:06 then you can also learn that for some problems, you're still better off using scikit-learn,
+
+00:35:09 because you just evaluate it and then you learn that the number is better on the scikit-learn
+
+00:35:13 side of things. Yeah, that's when you feel like, oh, maybe I did it wrong, where you paid
+
+00:35:17 a bunch of money to run expensive, slow LLMs, and you're like, I would just use...
+
+00:35:22 Well, so it's funny you say that. So I've actually been talking to a bunch of people that do LLMs at companies here in
+
+00:35:26 the Netherlands. And you know, you go to a PyData, you go to a conference, and you give them just
+
+00:35:31 enough beer so they're honest. In that kind of a situation, the NDA curtain opens just a little,
+
+00:35:37 or whatever.
+
+00:35:38 Plausible deniability is the name of the game in this one, yes.
+
+00:35:42 But then the stories you hear, which I did find somewhat encouraging,
+
+00:35:45 is they do all kind of go, well, there's budget now to do AI stuff,
+
+00:35:49 which means that we try out all the different LLMs,
+
+00:35:51 and we really can invest in evals and that sort of thing.
+
+00:35:54 And funnily enough, we also put some base models in there,
+
+00:35:57 like as a standard benchmark that we should beat.
+
+00:35:59 And I've heard a story a bunch of times now that because of the hype around LLMs and AI,
+
+00:36:05 after it was implemented, after they did all the benchmarks,
+
+00:36:07 it turns out that AI is the reason that scikit-learn is now in production in a bunch of places.
+
+00:36:10 And it's also the same thing with like spaCy.
+
+00:36:13 Because what a lot of people do learn is that, hey, if the spaCy model or like the lightweight model, so to say,
+
+00:36:19 is like somewhat equivalent to the LLM model after you give it like enough training data,
+
+00:36:24 which you do want to have anyway, because you need that for evaluations.
+
+00:36:27 Well, typically those models are more lightweight and they will always produce the same output.
+
+00:36:32 Same thing goes in, same prediction will always come out.
+
+00:36:34 And you can really fence it off, have it on like a normal VM.
+
+00:36:37 That's also totally fine.
+
+00:36:39 Oh, and you know, another benefit, it's super lightweight to run.
+
+00:36:42 You just need a Lambda function and you're good, as opposed to like a GPU or like a huge cloud bill or something like that.
+
+00:36:48 So some of the stories that I'm hearing do suggest that,
+
+00:36:52 okay, these LLMs are also helping out like standard data science work, as it were,
+
+00:36:57 if only because management now really does want to be serious about the investment
+
+00:37:00 and really wants to do the experiments.
+
+00:37:03 This portion of Talk Python To Me is brought to you by NordStellar. NordStellar is a threat exposure
+
+00:37:07 management platform from the Nord security family, the folks behind NordVPN, that combines dark web
+
+00:37:13 intelligence, session hijacking prevention, brand abuse detection, and external attack surface
+
+00:37:19 management. Keeping your team and your company secure is a daunting challenge. That's why you
+
+00:37:24 need NordStellar on your side. It's a comprehensive set of services, monitoring, and alerts to limit
+
+00:37:30 your exposure to breaches and attacks and act instantly if something does happen. Here's how
+
+00:37:35 it works. NordStellar detects compromised employee and consumer credentials.
It detects stolen
+
+00:37:42 authentication cookies found in InfoStealer logs and dark web sources and flags compromised devices,
+
+00:37:48 reducing MFA bypass ATOs without extra code in your app. NordStellar scans the dark web for
+
+00:37:54 cyber threats targeting your company. It monitors forums, markets, ransomware blogs, and over 25,000
+
+00:38:01 cybercrime Telegram channels with alerting and searchable context you can route to Slack or your
+
+00:38:07 IR tool. NordStellar adds brand and domain protection. It detects cybersquats and lookalikes
+
+00:38:13 via visual and content similarity and certificate transparency logs, plus broader brand abuse
+
+00:38:19 takedowns across the web, social, and app stores to cut the phishing risk for your users.
+
+00:38:24 They don't just alert you about impersonation, they file and manage the removals.
+
+00:38:29 Finally, NordStellar is developer-friendly.
+
+00:38:31 It's available as a platform and an API.
+
+00:38:35 No agents to install.
+
+00:38:36 If security is important to you and your organization, check out NordStellar.
+
+00:38:40 Visit talkpython.fm/nordstellar.
+
+00:38:42 The link is in your podcast player's show notes and on the episode page.
+
+00:38:45 Please use our link, talkpython.fm/nordstellar, so that they know that you heard about their service from us.
+
+00:38:52 And you know what time of year it is.
+
+00:38:53 It's late fall.
+
+00:38:55 That means Black Friday is in play as well.
+
+00:38:57 So the folks at NordStellar gave us a coupon, BLACKFRIDAY20.
+
+00:39:02 That's BLACKFRIDAY, all one word, all caps, 20, two, zero.
+
+00:39:05 That grants you 20% off.
+
+00:39:07 So if you're going to sign up for them soon, go ahead and use BLACKFRIDAY20 as a code,
+
+00:39:11 and you might as well save 20%.
+
+00:39:13 It's good until December 10th, 2025.
+
+00:39:16 Thank you to the whole Nord security team for supporting Talk Python To Me.
+
+00:39:21 You could say, your mandate is to add AI.
+
+00:39:24 And then you go, well, we got to see how well it's working.
+
+00:39:26 So we tested it.
+
+00:39:27 And then we have to compare it to something of a benchmark.
+
+00:39:30 Exactly.
+
+00:39:32 The benchmark is fast and effectively free and about as good.
+
+00:39:36 So we're good.
+
+00:39:38 We're just going to do that.
+
+00:39:39 You know, I think it's fine.
+
+00:39:41 Some organizations need like a mandate from above in order to get something done.
+
+00:39:45 And this LLM craze, if nothing else, does seem to have caused the mandate from above.
+
+00:39:50 I'm sure it is.
+
+00:39:52 So that's something I would keep in the back of your mind as well, dear listener.
+
+00:39:56 Like sometimes that mandate can also give you permission to do other things.
+
+00:40:00 Yeah.
+
+00:40:01 Not working for a large enterprise type company for quite a while,
+
+00:40:05 I imagine I'm pretty blind to how that is transforming directives from the top.
+
+00:40:11 But I'm sure bosses are like, I'm using ChatGPT and it's so much better than our software.
+
+00:40:15 What are we going to do about that?
+
+00:40:16 You know? Yeah, it's, uh, well, I mean, I'm in the developer tooling space, so I still talk to big
+
+00:40:21 companies and small companies as well. You do notice that big companies, they work differently
+
+00:40:25 than small companies. That is certainly true, um, for better or worse. Like there's also
+
+00:40:28 good reasons why bigger companies, like if I'm a bank, like, uh, I'm pretty sure it's a good idea to
+
+00:40:33 also have some rules that I have to abide by. Which, yeah, I'm just thinking more of like how disconnected
+
+00:40:38 is the higher-level management from the product and like how much do they just like dictate a thing.
+
+00:40:43 Okay, so let's talk about some of the tools.
One of the things you primarily focused on was using + +00:40:50 Simon Willison's LLM library. Tell us about this thing. So the funny thing about that library is + +00:40:56 that it's actually kind of more meant as a command line utility. I think that's kind of the entry + +00:41:00 point that he made. But then he also just made sure there was some sort of a Python API around it. + +00:41:06 And after looking at that library, and also after playing around with all these other libraries, + +00:41:11 and I'm about to say this, but this is a compliment. + +00:41:13 I just found the LLM library by Simon Willison by far to be the most boring. + +00:41:18 And I mean that really in a good way, just unsurprising, + +00:41:22 only does a few things. + +00:41:23 The few things that it does is just in a very predictable way. + +00:41:26 And especially if you're doing rapid prototyping, what I felt was just kind of nice is that + +00:41:30 it does feel like you have building blocks that allow you to build very quickly, + +00:41:33 but it doesn't necessarily feel like you're dealing with abstractions + +00:41:35 that can sort of wall you in at some point. + +00:41:38 So for a hello world getting started, just do some things in a very Pythonic way, + +00:41:43 this boring library really did the trick. + +00:41:46 And I'm also happy to report it's still a library that I use. + +00:41:49 It's still definitely a library in my tool, though. + +00:41:51 Whenever I want to do something real quick, that is definitely a tool that I refer to. + +00:41:56 And one of the things-- + +00:41:57 Is it kind of an abstraction across the different LLMs + +00:42:01 and that kind of stuff? + +00:42:02 If I want to talk to Anthropic versus OpenAI, that doesn't matter? + +00:42:06 So the way that this is built is also, I think, with good taste. + +00:42:09 So what you don't want to do as a library is say, like, I'm going to basically support every library under the sun. 
+
+00:42:15 Because as a single maintainer, in the case of Simon Willison, you're just going to drown in all these different providers.
+
+00:42:20 So what he went for instead was a plugin ecosystem.
+
+00:42:23 Now, the downside of a plugin ecosystem is that then you defer the responsibility of maintaining a plugin for a specific source to another maintainer.
+
+00:42:30 But you might get someone who works at Mistral to make the Mistral plugin.
+
+00:42:34 And you might get someone who works at OpenRouter to make the OpenRouter plugin, etc.
+
+00:42:38 So you do distribute the workload in kind of a nice way.
+
+00:42:42 And all the bigger models are definitely supported.
+
+00:42:45 So the Anthropic and the OpenAI ones, those are just always in there.
+
+00:42:50 But you will also definitely find some of the more exotic ones
+
+00:42:54 that they will also just have a plugin themselves.
+
+00:42:56 And one thing that also helps under the hood is that OpenAI has a standard under the hood.
+
+00:43:01 Their SDK has become a bit of a standard across the industry.
+
+00:43:04 So you can also reuse the OpenAI stuff.
+
+00:43:07 It would not surprise me at all that if you were just
+
+00:43:09 to change a URL in the settings, that you can also connect to Mistral via the OpenAI objects.
+
+00:43:15 I would have to double check, but that wouldn't surprise me.
+
+00:43:17 Yeah, that is a very common way of just like, you know what, we're all going to just adopt OpenAI's.
+
+00:43:23 It's a little like S3.
+
+00:43:24 Like when I was saying S3 earlier, I was actually talking to DigitalOcean.
+
+00:43:28 But it doesn't matter.
+
+00:43:29 I'm just still using Boto3 to talk to it.
+
+00:43:31 Yeah, it does feel weird, like, oh, you want to use DigitalOcean?
+
+00:43:34 You have to download an SDK from a competing cloud provider and then you can.
+
+00:43:39 Exactly. It's so weird.
+ +00:43:41 But I mean, the thing I do find that it's just a little bit funny here is like technically, + +00:43:45 this thing is meant to be used from the command line. It just happens that it's written in Python + +00:43:49 and it just happens that also it has a decent Python API. And that's the thing I actually end up + +00:43:54 using more than stuff from the command line. That's because I do a lot of things in notebooks. So the + +00:43:58 stuff that I tend to use is a little bit more in the Python side. + +00:44:01 Sure, of course. Same here. + +00:44:02 Yeah, I don't think I would use it very much on the command line. + +00:44:05 But, you know, I can see using it as a step in a series of things happening on the command line, + +00:44:11 like some sort of orchestration, like X, Y, ask the LLM, Z, you know? + +00:44:16 Well, the main thing I actually use it for from the command line is you can do a git diff + +00:44:21 on whatever you're working on and then sort of pipe that through to the LLM to make the commit message. 
+
+00:44:26 Like there are like these little moments like that where you do want to have a little automation,
+
+00:44:29 maybe in CI, and like being able to do this from a command line is definitely useful. It's just not
+
+00:44:34 the thing I use it for most. Or like if I use it that way, it's a thing I automate once and then
+
+00:44:38 never really look at it. But the interaction really for me is more in the notebook. Oh yeah, that's a
+
+00:44:44 very cool library. I definitely need... Like one thing that might be fun to point out, because
+
+00:44:49 that's something I built, I think, around the same time: if you were to type into Google
+
+00:44:54 smartfunc and then my GitHub alias, koaning, K-O-A-N-I-N-G, um, because it's a little...
+
+00:45:02 Okay, you already had it open. Good man. I'm really good with Google, actually. I use Startpage these
+
+00:45:07 days, but I'm sure I could find it there. Oh, okay, here you are. Well, you have an ergonomic keyboard,
+
+00:45:11 so there you go, quick typing already. Yeah, no, so like a thing I was able to build on top of, uh,
+
+00:45:16 the thing that Simon Willison made, and it's something that appears in the course as well.
+
+00:45:24 Yeah, there you go. So basically what this function does is you can add a decorator.
+
+00:45:30 You just have a normal Python function. The inputs, you just put types on it. So you can say like,
+
+00:45:35 this input is a string, this input is an integer, or what have you. You then add a decorator that
+
+00:45:40 says what backend you want to use. So GPT-4 or anything that Simon Willison supports. And then
+
+00:45:45 the docstring is the prompt you give to the LLM. So you can do something like, hey, I have a function
+
+00:45:51 that accepts a term called paragraph.
+
+00:45:53 That's a string.
+
+00:45:54 And then the docstring of the function says, summarize this.
+
+00:45:56 And then lo and behold, you now have a Python function
+
+00:45:59 that can do summaries.
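This is not smartfunc's actual implementation, but the decorator idea just described can be sketched in a few lines; `run_llm` is a hypothetical stand-in for a real backend call, and plain `str.format` stands in for the templating:

```python
import inspect
from functools import wraps

def run_llm(backend, prompt):
    # Hypothetical stand-in for a real call, e.g. via Simon Willison's llm library.
    return f"[{backend}] response to: {prompt}"

def llm_backend(backend):
    """Decorator: the wrapped function's docstring becomes the prompt template."""
    def decorator(func):
        sig = inspect.signature(func)
        @wraps(func)
        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            bound.apply_defaults()
            # Fill the docstring's placeholders with the bound arguments.
            prompt = func.__doc__.format(**bound.arguments)
            return run_llm(backend, prompt)
        return wrapper
    return decorator

@llm_backend("gpt-4o-mini")
def generate_summary(paragraph: str) -> str:
    """Summarize the following text: {paragraph}"""

result = generate_summary("An LLM is a strange building block.")
```

The function body stays empty: the signature supplies the variables, the docstring supplies the prompt, and the decorator supplies the backend.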
+ +00:46:00 I see. + +00:46:01 And the doc string can be effectively a Jinja template. + +00:46:05 Yes. + +00:46:05 If you wanted to, it could be a Jinja template. + +00:46:08 Alternatively, what you can also do is you can also say, well, what the function returns + +00:46:12 is the prompt. + +00:46:13 That's also something you can do. + +00:46:15 Got it. + +00:46:16 But this is just a very quick hack for some of the stuff + +00:46:19 that I want to do real quick. + +00:46:20 I don't necessarily recommend other people to use this, + +00:46:23 but this is something you can build on top of the LLM library + +00:46:25 from Simon Willison. + +00:46:26 So if you want to make your own syntactic sugar, your own abstractions, it's definitely something + +00:46:29 that you can also do. + +00:46:31 And also, again, the name of the game here is quick iteration. + +00:46:34 So also feel free to think of your own tools now and again. + +00:46:37 But you want the base layer to be as boring as possible. + +00:46:39 And Simon Willison's library, again, does that in a good way. + +00:46:42 Right, right. + +00:46:43 Just don't make me think about the details. + +00:46:45 Yes. + +00:46:46 Let me swap out the model providers, which is actually + +00:46:48 doing quite a bit, but not conceptually. + +00:46:51 Yeah. + +00:46:52 So again, one thing I do think at this point in time, + +00:46:56 if you've not played around with this software stuff already, + +00:46:59 I do recommend maybe doing that sooner rather than later. + +00:47:02 And I think Simon Willison's approach, if you're already a Python person, + +00:47:04 it's probably the simplest way to get started. + +00:47:06 But I do want to warn people in the sense of these are just tools. + +00:47:10 Tools are useful. + +00:47:11 Tools are great. 
+
+00:47:13 But at some point, it's not so much about the tools that you use,
+
+00:47:15 but more about how you think about the tools. Like the way you interact with these tools,
+
+00:47:19 is it not? Like you can use a hammer in many different ways,
+
+00:47:22 but the way you think about a hammer is like also equally important.
+
+00:47:25 - Yeah, 100%.
+
+00:47:26 So speaking of thinking about how this goes, one of the challenges is,
+
+00:47:30 so let's just take this example here.
+
+00:47:33 The example on your smartfunc GitHub readme, it says generate summary,
+
+00:47:38 takes some text, it says generate a summary of the following text, colon, text.
+
+00:47:42 - Yes, text.
+
+00:47:43 - Super straightforward.
+
+00:47:44 But what if you changed the prompt a little bit and said, give me all of the
+
+00:47:48 Python libraries discussed in the following markdown content.
+
+00:47:52 Right?
+
+00:47:53 And do you want a novella back, like a novel back, when really what you want is like,
+
+00:47:59 I want a list of strings and URLs or whatever.
+
+00:48:03 Like how do I get programmability instead of just chatability, I guess?
+
+00:48:07 Yeah.
+
+00:48:08 So the thing that Simon Willison's library does allow you to do is you are able to
+
+00:48:13 say, well, I have a prompt over here, but the output that I'm supposed to get out, well, that has to be
+
+00:48:18 a JSON object of the following type. And then you can use Pydantic. So you can do something like, well,
+
+00:48:22 I expect a list of strings to come out over here in the case that you're trying to detect Python
+
+00:48:25 libraries or Pokemon names or what have you. Or you can say, well, it's more like a classification.
+
+00:48:31 I believe you can pass a literal with like only these values, and that's also a constraint. And
+
+00:48:35 the reason that you can do that is some of these LLMs are actually trained on specific tasks.
One of
+
+00:48:41 these tasks could be tool calling. Another one of these tasks that they're trained for these days is
+
+00:48:45 that you have to say the right word here, something, something output, structured output,
+
+00:48:51 so that the LLM can sort of give a guarantee that if you declare that you get a list of strings out,
+
+00:48:56 you actually do get a list of strings out. And that's something that these things are
+
+00:48:59 typically trained for. So that's something you can do. If you're using open source models,
+
+00:49:05 definitely check them out. They're cool. But quality does vary, is what I will say.
+
+00:49:10 So that's something to always keep in the back of your mind.
+
+00:49:12 And also the more complex you make your Pydantic objects,
+
+00:49:15 like if you make like a nested thing of a nested thing of a nested thing,
+
+00:49:18 at some point the LLM is going to have a bit of trouble with that.
+
+00:49:21 Even though Pydantic is of course great, but like you make it harder for the LLM to do the right thing at some point.
+
+00:49:27 But that's also a thing that you definitely want to use
+
+00:49:30 if you're doing stuff with software.
+
+00:49:32 Like the really cool thing about Pydantic is you can say,
+
+00:49:34 I expect something that looks like this to come out
+
+00:49:36 and you can force an LLM to also actually do that.
+
+00:49:39 Like, you will get the right types that come out.
+
+00:49:41 The contents might still be maybe a bit spooky.
+
+00:49:43 But at least you get the right types out, which makes it a lot more convenient to write software.
+
+00:49:47 You can say, I can loop over the things that it sent back,
+
+00:49:49 and you're not in vague text parsing land.
+
+00:49:53 Well, the main thing you get is--
+
+00:49:55 I remember when the first OpenAI models sort of started coming
+
+00:49:58 out, I worked back at Explosion.
+
+00:50:00 We made this tool called spaCy there.
+
+00:50:02 And one of the things that we were really bummed out about
+
+00:50:04 was LLMs can only really produce text.
+
+00:50:06 So this structural stuff, like if you want to detect a substring in a text, for example,
+
+00:50:11 which is the thing that spaCy is really good at, then you really want to guarantee, well, I'm going to select a substring that actually did appear in the text.
+
+00:50:18 I want to know exactly where it starts and exactly where it ends, and that's the span that I want to select.
+
+00:50:22 So normally in NLP, text comes in and structured information comes out.
+
+00:50:27 But here come these LLMs and text comes in and text comes out.
+
+00:50:30 And then you've got to figure out a way to guarantee that structured output actually comes out.
+
+00:50:34 So that's the thing that the LLM providers also acknowledge.
+
+00:50:36 So that's something that they're actually training these models for.
+
+00:50:39 But I think it's a lot better these days, but it was definitely a huge part of the problem
+
+00:50:45 like three to four years ago when these things first happened.
+
+00:50:47 It was super annoying.
+
+00:50:48 So if you use Pydantic, you define a Pydantic model that derives from pydantic.BaseModel.
+
+00:50:54 You put the types, type information, fields, et cetera.
+
+00:50:58 And then what does it do?
+
+00:50:59 Does it kind of use the schema generation that Pydantic already has?
+
+00:51:04 Like people are familiar with FastAPI.
+
+00:51:06 You set the response model, it'll do the OpenAPI spec, which I believe just comes more or less from Pydantic, right?
+
+00:51:12 Could be.
+
+00:51:13 I think it's a JSON standard in the end.
+
+00:51:16 Like, you know how you can generate JSON schemas as well?
+
+00:51:18 I think it's that spec that it's using under the hood
+
+00:51:21 that is then also given off to OpenAI or what have you.
+
+00:51:24 So I think in the end, it's a JSON spec, but I could be wrong.
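That JSON Schema handoff is easy to see directly: Pydantic emits a schema for the model, and the JSON text the LLM sends back can be validated straight into typed objects. A small sketch (the model and the canned reply are illustrative, not from the episode):

```python
from typing import Literal

from pydantic import BaseModel


class Libraries(BaseModel):
    # Constrain the output: a list of strings, plus a field
    # restricted to a fixed set of values via Literal.
    names: list[str]
    verdict: Literal["useful", "not useful"]


# The schema that rides along to the provider with the prompt:
schema = Libraries.model_json_schema()

# And the JSON text the LLM sends back gets parsed into real types:
raw = '{"names": ["httpx", "pydantic"], "verdict": "useful"}'
result = Libraries.model_validate_json(raw)
print(result.names)  # a real list[str], safe to loop over
```

If the reply doesn't match the schema, `model_validate_json` raises instead of handing you vague text to parse.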
+
+00:51:27 But it's driven by the Pydantic model.
+
+00:51:29 And then that's given as part of the prompt.
+
+00:51:31 Like, here's my question.
+
+00:51:32 Your answer will look like this.
+
+00:51:34 And it uses the Pydantic, something to that effect.
+
+00:51:36 Yeah, there's one small caveat there in the sense that you can also add custom types
+
+00:51:42 with Pydantic.
+
+00:51:42 So you can do something like, well, this is a non-negative integer, for example.
+
+00:51:46 And you can definitely imagine that, especially with these validations
+
+00:51:49 that you can write in Python, not everything that you can come up with there
+
+00:51:52 is going to be supported by this JSON spec.
+
+00:51:54 So you are going to get some of these weird edge cases
+
+00:51:56 maybe where Pydantic can do more than what the JSON spec can do on your behalf.
+
+00:52:00 There is a really cool library, though, called Instructor that actually--
+
+00:52:04 Yes, I was just thinking about that.
+
+00:52:06 So what they do is actually kind of cute.
+
+00:52:09 The impression I have is that they acknowledge that Pydantic can actually do a bit more than what the JSON
+
+00:52:13 spec can do, especially because the validator that you write
+
+00:52:16 can really be a custom whatever thing you care about.
+
+00:52:19 So what they do as a trick is they say, well, if the LLM gives me something back that doesn't get validated
+
+00:52:25 by Pydantic, we take the error message, together with the thing that we got back,
+
+00:52:30 make a new prompt to the LLM and just ask it again.
+
+00:52:33 And see, maybe now with the hint, it is able to give us the proper structure back.
+
+00:52:37 That's sort of the trick.
+
+00:52:38 It sounds silly, but anyone who's done agentic AI and stuff,
+
+00:52:41 like, you know, Claude or whatever, that's a lot of the magic where you say,
+
+00:52:44 it tried this.
+
+00:52:45 Oh, I see.
+
+00:52:46 I actually forgot an import.
+
+00:52:47 Oh, I see.
+
+00:52:48 I passed the wrong command.
+
+00:52:49 Let me try again a different way.
+
+00:52:51 Oh, now it's working.
+
+00:52:52 And, you know, I think there's a lot of potential with that.
+
+00:52:54 So, well, definitely true.
+
+00:52:56 The one thing I think is safe to say at this point,
+
+00:52:59 although I'm not a researcher, right?
+
+00:53:01 So don't pin me down on this.
+
+00:53:02 That library did originate from an early era of LLMs
+
+00:53:05 when the structured stuff wasn't really being trained for yet.
+
+00:53:09 And it is my impression that LLMs have gotten a fair bit better
+
+00:53:13 at this structured task at this point.
+
+00:53:15 So you could argue that maybe that library is solving
+
+00:53:18 a solved problem with the technique that they've got.
+
+00:53:21 But I still don't want to discount it, because inevitably, at some point,
+
+00:53:25 you might want to go for a very lightweight model that you can run locally that isn't fine-tuned
+
+00:53:29 on the structured output task. And in that situation, this technique can actually do a lot of good.
+
+00:53:33 Like you might be prompting the LLM like three times, but you might be able to get away
+
+00:53:37 with that because the model is just so much lighter, because it doesn't have to do all those tasks that
+
+00:53:41 those big LLMs are trained for. Right. Especially if you're running on your machine or you're
+
+00:53:46 self-hosting the model, you're not like, oh, it's going to cost us so much money. It's just,
+
+00:53:49 it'll take a little longer, but it's fine. Yeah. So like there's a lot of support for
+
+00:53:53 Instructor. There's also Ollama support for the LLM library from Simon Willison,
+
+00:53:57 by the way. But yeah, you can definitely imagine this is still going
+
+00:54:01 to be relevant with the Ollama stuff going forward.
+
+00:54:04 And also, it's a reasonable trick if you think about it.
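The retry mechanic described here can be hand-rolled in a few lines, which also shows why it's such a reasonable trick. A sketch of the idea (not Instructor's actual internals); `ask_llm` is a hypothetical stand-in that fails once and then corrects itself:

```python
from pydantic import BaseModel, ValidationError


class Summary(BaseModel):
    title: str
    bullet_points: list[str]


def ask_llm(prompt: str) -> str:
    """Hypothetical stand-in: a real version would call a model.
    The first reply is malformed; the retry (with the error hint
    appended to the prompt) comes back valid."""
    ask_llm.calls += 1
    if ask_llm.calls == 1:
        return '{"title": "Demo"}'  # missing bullet_points -> invalid
    return '{"title": "Demo", "bullet_points": ["a", "b"]}'


ask_llm.calls = 0


def structured(prompt: str, model: type[BaseModel], max_retries: int = 3):
    """Ask, validate with Pydantic, and on failure re-ask with the
    validation error appended as a hint -- the Instructor-style trick."""
    for _ in range(max_retries):
        reply = ask_llm(prompt)
        try:
            return model.model_validate_json(reply)
        except ValidationError as err:
            prompt = f"{prompt}\nYour last reply was invalid: {err}"
    raise RuntimeError("LLM never produced valid output")


result = structured("Summarize this episode.", Summary)
```

Note the loop bound: as mentioned in the conversation, sometimes the model sends the same error back repeatedly, so you cap the retries and fail loudly.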
+
+00:54:08 The main thing that we do is we just add a retry mechanic to the structured outputs by using
+
+00:54:11 the Pydantic validator, effectively.
+
+00:54:13 I mean, think about what you probably do, certainly what
+
+00:54:15 I do when I'm working with ChatGPT or something.
+
+00:54:18 It gives me a wrong answer like, no, this is not what I
+
+00:54:19 wanted. This is what I wanted. It's like, oh, okay.
+
+00:54:21 I see.
+
+00:54:22 And it's kind of automating that, more or less.
+
+00:54:25 It's certainly an attempt, yeah.
+
+00:54:27 In my experience, I have also seen moments when it sends the same error back five times,
+
+00:54:33 and then it just kind of fails.
+
+00:54:34 So there is also a limit to the retry mechanic.
+
+00:54:37 Sure, sure.
+
+00:54:38 It's a little bit like tenacity or stamina.
+
+00:54:41 Exactly.
+
+00:54:41 APIs or whatever.
+
+00:54:43 It uses tenacity under the hood, if I'm not mistaken.
+
+00:54:45 So there's definitely that mechanism in there.
+
+00:54:47 Nice.
+
+00:54:48 It just-- it didn't work.
+
+00:54:49 Try it again.
+
+00:54:50 It's like turn it off and turn it on again.
+
+00:54:51 See if it works now?
+
+00:54:52 Yep, pretty much.
+
+00:54:53 Television.
+
+00:54:54 Yeah, a little more smart than that, though, because it does validate and return the error.
+
+00:54:58 So super cool.
+
+00:54:59 Yep.
+
+00:54:59 Okay.
+
+00:55:00 Have you worked with, I guess, I don't know, higher order programming models?
+
+00:55:05 So, you know, I saw Samuel Colvin yesterday talking about Pydantic AI.
+
+00:55:10 I saw on X, I saw Sydney Runkle talking about some big release, like a 1.0 release
+
+00:55:17 or something at LangChain.
+
+00:55:18 What are your thoughts on some of these higher order orchestration libraries?
+
+00:55:22 I mean, the main thing for me, at least, this was a demo I did do with the Pydantic AI one.
+
+00:55:29 I do think types are quite powerful when you think of LLMs.
+
+00:55:34 There's a live stream you can check on the Marimo YouTube
+
+00:55:38 if you're keen to see a full demo.
+
+00:55:39 But let me just sketch you a problem that used to be really hard.
+
+00:55:42 Six years ago, we used to work at this company called Rasa.
+
+00:55:45 We worked on chatbot software.
+
+00:55:47 And a thing that was really hard is you--
+
+00:55:49 Let's say you've got a chatbot where you can order a pizza.
+
+00:55:51 So you can do things like, yep, that's the one.
+
+00:55:54 It's like also one of the Pydantic AI engineers, and he's also the creator of Starlette, by the way.
+
+00:55:58 Cool project. Marimo is built on top of Starlette.
+
+00:56:01 Main maintainer of that project.
+
+00:56:02 He deserves to get more high fives from the community,
+
+00:56:04 so I do want to point it out.
+
+00:56:07 Well, let's say you're working on a pizza bot, and okay, someone is ordering a pizza.
+
+00:56:11 Suppose you go to the bot and you say, "Hey, I want to order a pepperoni pizza."
+
+00:56:15 But you can imagine there's a pizza type, and a pizza type says, "What's the kind of pizza?
+
+00:56:19 What's the size of the pizza?
+
+00:56:20 And do you want to have extra toppings?"
+
+00:56:22 And that has to be a list of ingredients, let's say.
+
+00:56:24 But if I tell you, hey, I want to have a pepperoni pizza,
+
+00:56:29 but I don't know the size yet, well, then I need the LLM to ask a good follow-up question.
+
+00:56:33 OK, how are you going to build that logic?
+
+00:56:35 Because that can get quite gnarly quite quickly, especially if, in this case, the pizza is simple.
+
+00:56:39 But you can come up with a complicated object, like a complicated form almost.
+
+00:56:42 And then--
+
+00:56:43 Right, right.
+
+00:56:43 I only need to ask this question if they've answered that one
+
+00:56:46 in that way.
+
+00:56:47 Right, it could get really tricky, yeah.
+
+00:56:49 Well, and then the question is, how can you actually get the LLM to do this for you?
+
+00:56:53 And there's a trick that you can do with a type, because what you can say is,
+
+00:56:56 well, the response from the LLM has to be of type Pizza,
+
+00:56:59 bar, string, meaning Pizza or string.
+
+00:57:01 And then if the prompt says something along the lines of,
+
+00:57:03 well, parse the output of the user such that it is a pizza,
+
+00:57:06 but then if this validation error occurs, oh, how do you express that logic?
+
+00:57:10 Well, you can express it with a type.
+
+00:57:11 You just say, this is a string, or it's the type that I'm actually interested in, the pizza.
+
+00:57:15 And then the LLM has to figure out, is the response either going to be the string where I'm asking the user for missing information,
+
+00:57:21 or is it going to be the pizza object? And if it's a pizza object, then the code will actually talk
+
+00:57:24 to the back end. So you do get to rethink how typing works, because we have LLMs at our disposal now. You
+
+00:57:30 can actually have the LLM handle some of that logic, as long as your type is, like, strict about
+
+00:57:34 it, or well defined, I should say. It's a bit of a weird, advanced feature of types and LLMs,
+
+00:57:40 but I can definitely imagine some of the Pydantic AI stuff does use this trick in order
+
+00:57:46 to perform some of the higher order stuff. And maybe you don't need a full framework as long as you are
+
+00:57:51 really, like, deliberate about your types. That's kind of the mentality I try to
+
+00:57:56 have right now. Interesting. Like, let's not worry about the big framework just yet. If we
+
+00:58:00 can just focus on the right types and make sure that all that stuff is kind of sane, then you also
+
+00:58:04 keep the abstractions at bay, which is, I think, also convenient, especially early on. It sounds a little
+
+00:58:09 bit like an analogy you might be able to draw with databases. Like, you could have a really strongly
+
+00:58:15 structured Postgres or relational database
where the database thing, aka the AI, is in charge of it.
+
+00:58:22 Or you could be using an ODM or ORM where the code will only accept responses, right? So like,
+
+00:58:29 for example, you can get away with talking to a document database if you have strongly typed
+
+00:58:34 models. If you're just passing dictionaries, you're going to end up in a bad place real quick
+
+00:58:38 because something's not going to match, right? So it's kind of like the Pydantic thing is being a
+
+00:58:42 check for if it's not quite perfect. I don't know. I feel like there's an analogy there.
+
+00:58:46 There's a bit of an analogy there. The main thing with the LLM is like, oh, how much of the logic do
+
+00:58:52 you want to control yourself as a guarantee? And how much would you like the LLM to do it? And
+
+00:58:57 oh, that's always kind of a struggle. Like what is the right glue for that? And it turns out that
+
+00:59:02 Pydantic is in a very funny, unique position to say, how about we just use types? And that actually...
+
+00:59:07 It just kind of works.
+
+00:59:08 How ironic is it that that's such a popular thing coming out of an effectively untyped language?
+
+00:59:15 Yeah.
+
+00:59:16 Maybe that's why it's so popular.
+
+00:59:18 I think it partly is.
+
+00:59:19 I mean, it's my impression that there's a lot of this stuff also happening on the JavaScript
+
+00:59:23 side of things, if I'm honest.
+
+00:59:24 Now, for the JavaScript side of things, the argument is, of course, oh, you can also build
+
+00:59:27 the web UIs a bit easier and the user experience and that whole story.
+
+00:59:30 But they also do a lot of this validation stuff.
+
+00:59:32 I forget the library, but there's also--
+
+00:59:35 I'm learning a little bit of JavaScript now.
+
+00:59:37 There's actually Pydantic-like things happening in JavaScript land as well.
+
+00:59:40 Yeah, it's probably jidantic.
+
+00:59:44 Just-dantic.
+
+00:59:46 Just-dantic, yeah, j-s-dantic.
+
+00:59:49 Or TypeScript or whatever.
+
+00:59:50 But there's also like ORM libraries for JavaScript that are like type-y,
+
+00:59:54 but not like super TypeScript just yet.
+
+00:59:56 Yeah, there's a lot of interesting analogies from the JavaScript space for us.
+
+00:59:59 You know, like the whole TypeScript team just rewrote everything in Go,
+
+01:00:04 whereas we're seeing a lot of stuff in Python being moved to...
+
+01:00:06 Rust. Rust. Yeah, exactly. So pretty interesting. But let's, let's close this out a little bit with
+
+01:00:12 like running some of our own models. Okay. So clearly there's Anthropic, there's OpenAI,
+
+01:00:18 there's, you know, all the big, huge foundational models. But some of these things we're talking
+
+01:00:22 about, it's like this Pydantic schema validation type of thing. And so on. You know, maybe we could,
+
+01:00:27 as you said, get away with running a smaller model, maybe small enough that we could run it
+
+01:00:30 ourselves on our servers, or on my Mac mini here, M2 Pro, 32 gigs of RAM. I have the OpenAI
+
+01:00:39 20-billion-parameter model running as like my default LLM for any code that I write that
+
+01:00:45 needs to talk to an LLM. So, yeah, we actually have a video on the Marimo channel on that one, but
+
+01:00:51 go check that out later. Basically, I've recorded that. That link, we'll put in the show notes. Yeah. No, so
+
+01:00:56 I kind of find myself exploring all these different open source LLMs like once every two months or so.
+
+01:01:01 And then Marimo has a couple of interesting integrations that makes it really easy to either hook into Ollama or,
+
+01:01:06 and that's like the only thing I might change on the course that we made,
+
+01:01:09 there's now also the service called OpenRouter.
+
+01:01:11 Like I don't know if you've ever seen that one.
+
+01:01:13 Yes, I've heard of it.
+
+01:01:14 Because I was looking at Cline.bot.
+
+01:01:17 Are you familiar with Cline.bot?
+
+01:01:18 Oh, yeah.
+
+01:01:19 Cline is cool.
+
+01:01:19 Cline is really cool.
+
+01:01:20 I've been using that for a bit.
+
+01:01:22 Tell me what Cline is real quick.
+
+01:01:24 So Cline is basically Copilot for VS Code, but the one thing that they do
+
+01:01:29 is they really let you do any model, like just plug it in.
+
+01:01:32 And they are really strict about plan mode versus like run mode, if that makes sense.
+
+01:01:36 So you really got to, and one, the UI is kind of cool.
+
+01:01:39 So like every time that you're planning or doing whatever,
+
+01:01:42 you kind of see the progress bar of your context.
+
+01:01:45 And then you also see how much money it costs.
+
+01:01:47 So you're always aware of the fact that, okay, this thing is costing me bucks now.
+
+01:01:51 And it's just a refreshing take on GitHub Copilot.
+
+01:01:54 And I will say, you can't do sessions in parallel, because it is, I think, at the moment, still stuck to the VS Code
+
+01:02:00 extension ecosystem.
+
+01:02:01 Could be wrong with that.
+
+01:02:02 They just introduced a CLI.
+
+01:02:04 Ah, there you go.
+
+01:02:05 There you go.
+
+01:02:06 I bet you can do it with that.
+
+01:02:07 Yeah, so that's also--
+
+01:02:09 the main thing that drew me in was the whole, oh, they're very strict about plan mode versus Go mode,
+
+01:02:14 which I think is definitely super useful.
+
+01:02:17 And on top of that, you have some nice UI.
+
+01:02:19 Yeah, it's also real similar to Cursor and others.
+
+01:02:21 And I think it's a big selling point.
+
+01:02:23 It's open source and they don't charge for inference, so you just put in your API key and it just
+
+01:02:28 passes through. But you're right, they do have, as it's thinking, it's got a little tick-tick-tick
+
+01:02:31 price of the operation, and it's like, oh, it's up to three cents, five... You know, it's just interesting
+
+01:02:37 to watch it go. Also, it's kind of like the Fibonacci sequence. You kind of realize that it's
+
+01:02:41 additive. So it's like, oh, I've spent like half a dollar now. Oh, and the next problem is going to be
+
+01:02:45 half a dollar plus extra, and the next one's going to be like that plus extra. Exactly. It makes you manage
+
+01:02:51 your context size real well.
+
+01:02:53 So anyway, OK, that's a bit of a diversion.
+
+01:02:54 But I think this is a fun tool to shout out.
+
+01:02:57 No, I agree.
+
+01:02:57 So the reason I brought this up is I saw they were number one on OpenRouter.
+
+01:03:02 I'm like, wait, what's OpenRouter?
+
+01:03:03 Yeah, so OpenRouter is basically you have one API key,
+
+01:03:06 and then they will route to any LLM that you like.
+
+01:03:09 And one thing that's really cool about it is that if there's a new open source model
+
+01:03:13 that you see everyone on Twitter and YouTube rave about,
+
+01:03:15 it'll be on OpenRouter.
+
+01:03:17 And then you can just give it a spin.
+
+01:03:18 If you have, let's say, stuff in your cache, and you've got a couple of functions that you use for evaluation,
+
+01:03:24 and you've got a setup for that, then it's just changing a single string to give a new LLM model a try.
+
+01:03:29 And they will make sure everything is nicely routed.
+
+01:03:32 And that's like the whole service also, basically.
+
+01:03:34 I think they add like a 5% fee hike or something like that,
+
+01:03:39 but they've got all sorts of GPU providers competing for the lowest price.
+
+01:03:42 So they do all the routing to all the LLM models, but also the open source ones.
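The "change a single string" experiment is easy to picture, because OpenRouter speaks the OpenAI-compatible chat API: a model sweep is just a loop over model-name strings. A sketch that only builds the request payloads (the model names are illustrative; in practice you'd POST these to OpenRouter's chat-completions endpoint, or point the official `openai` client's `base_url` at it or at a local LM Studio or Ollama server):

```python
# Candidate model names, OpenRouter-style "provider/model" strings.
# (Illustrative names; check the OpenRouter catalog for real ones.)
CANDIDATES = [
    "qwen/qwen-2.5-7b-instruct",
    "qwen/qwen-2.5-14b-instruct",
]


def build_request(model: str, prompt: str) -> dict:
    """Shape of an OpenAI-compatible chat-completion payload.
    Only the model string changes between runs of the sweep."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


# The whole "experiment" is a for loop over one string:
sweep = [build_request(m, "List the Python libraries mentioned.") for m in CANDIDATES]
for request in sweep:
    print(request["model"])
```

Combine that with cached evaluation functions and finding the quality/size sweet spot becomes a batch job rather than a chore.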
+
+01:03:47 And that's the main convenient bit.
+
+01:03:48 Because I might not have-- so it's partially that I don't have the biggest GPU for the biggest models,
+
+01:03:55 but it's really convenient to just do kind of a for loop.
+
+01:03:57 Like, okay, take the 7B model from Qwen, the 14B model, and just loop over all of them,
+
+01:04:02 and see if there's like a sweet spot as far as quality is concerned and also model size.
+
+01:04:07 Like those kinds of experiments become a whole lot simpler if you just have one API that has all the models.
+
+01:04:11 And OpenRouter kind of gets you there.
+
+01:04:13 Yeah, okay, very neat.
+
+01:04:14 And it's also why Cline likes using it.
+
+01:04:16 So they have a direct connection with Cline.
+
+01:04:18 Just in the user interface, you can say, well, I want to have this model from OpenRouter.
+
+01:04:21 Just click, click, click, and it's all sort of ready to go.
+
+01:04:24 But that's their service.
+
+01:04:25 OK, very neat.
+
+01:04:26 So maybe for people who haven't run their own models,
+
+01:04:30 there's a couple of options.
+
+01:04:32 Sounds like you recommend Ollama.
+
+01:04:34 I mean, that's the one that I have used.
+
+01:04:35 You've used LM Studio, I think?
+
+01:04:37 I use LM Studio because LM Studio has this UI for discovering and setting them up
+
+01:04:42 and playing and configuring them.
+
+01:04:43 But then you can just say, run in dev mode, and it just runs, and here we go,
+
+01:04:47 an OpenAI-compatible--
+
+01:04:49 - There you go. - API endpoint on my local network.
+
+01:04:51 And so I can just point anything that can do ChatGPT,
+
+01:04:54 I can point it at this, whatever's running there.
+
+01:04:56 And I was running the GPT-OSS 20 billion one last time I tried.
+
+01:05:00 - There you go.
+
+01:05:01 Okay.
+
+01:05:01 So Ollama also has like a very similar thing.
+
+01:05:04 My impression is that LM Studio in the end is like pretty similar,
+
+01:05:07 just has a slightly different UI.
+
+01:05:09 It's a bit more of an app, I suppose.
+
+01:05:10 Like Ollama is definitely more of a command line utility.
+
+01:05:13 Does the job though, I would say, but it doesn't have that much to offer in the UI department.
+
+01:05:18 Yeah, I think if I were ever to try to set it up on my server,
+
+01:05:21 I'd probably just do Ollama, maybe inside a Docker or something.
+
+01:05:24 I don't know, something like that.
+
+01:05:25 Yeah, no, true.
+
+01:05:27 The one sort of mysterious thing with a lot of these services is,
+
+01:05:29 like, how do they make money?
+
+01:05:31 And I think I read somewhere that Ollama is going to start doing
+
+01:05:35 OpenRouter kind of stuff as well, so it wouldn't surprise me if they ended up going in that direction.
+
+01:05:40 But I mean, there's all sorts of vendors. And there's another thing I'm also seeing a lot of nowadays:
+
+01:05:45 use big models that generate a data set, such that they can take a super tiny model and fine-tune
+
+01:05:49 on the data that was generated by the big model. And that way, like, a future maybe that we
+
+01:05:53 might have is that you have a Dutch model that's really good for legal. Maybe we'll have like a
+
+01:05:58 super bespoke LLM for that at some point. Or like an LLM that's really good at CSS, but
+
+01:06:04 only CSS. Like, that's also a thing that we might have. And then those models can be really tiny and
+
+01:06:09 run on your laptop, and who knows, this is the age we live in. But I certainly can foresee a world
+
+01:06:14 like that where it's like, okay, I'm gonna work on a project. Instead of spinning up a general world
+
+01:06:19 knowledge model, I'm gonna just let it on demand go, okay, I have a CSS question. Well, what framework
+
+01:06:24 are you using? I'm using Tailwind. Okay, we're gonna spin up the Tailwind LLM model and we need to ask
+
+01:06:29
it a question. Yeah, and then we're gonna shut that down. You know what I mean? Like, really focused. But then have some
+
+01:06:33 kind of orchestrating thing that says, now I gotta talk to this, now let me work with that one.
+
+01:06:37 But that could be great. Yeah, and it might also be a solution to the whole, oh, the time cutoff is
+
+01:06:41 January 2025.
+
+01:06:43 And everything that happened after, we have no knowledge about.
+
+01:06:46 Like it could also be a way for us to have a system where we just retrain a thing weekly
+
+01:06:50 and it's just a diff on top or something like that.
+
+01:06:52 Right.
+
+01:06:53 Yeah.
+
+01:06:53 It could also happen.
+
+01:06:54 Yeah.
+
+01:06:54 And if it were small enough, then it would be okay.
+
+01:06:56 Yeah.
+
+01:06:57 It would be kind of like SETI@home.
+
+01:06:58 All of our stuff would just, like, kind of have the fan going.
+
+01:07:02 I mean, it's a bit of a science fiction thing, but you can imagine at some point every
+
+01:07:06 home will have solar panels, and then maybe every home will also just have a GPU, and maybe
+
+01:07:10 we won't have huge-ass server farms.
+
+01:07:12 It's just that we have a network, and everyone has a GPU, and if my neighbor suddenly
+
+01:07:16 wants to do math, then he can borrow my GPU.
+
+01:07:18 I don't know. We'll see.
+
+01:07:21 It kind of sounds cool, doesn't it?
+
+01:07:22 Yeah. If people like distributed systems,
+
+01:07:26 might as well go for it.
+
+01:07:29 If you want to add to the unpredictability of
+
+01:07:32 the LLM, mix in a distributed grid computing that you don't know about too much either,
+
+01:07:36 and then you'll get some real...
+
+01:07:38 I do want to give one quick shout-out on this one front, though, because I do think
+
+01:07:41 it's an idea that's worth pursuing.
+
+01:07:43 So in Holland, we've got this one cloud provider called
+
+01:07:46 LeafCloud, and I think they're still around.
+
+01:07:48 But what they do is they install these server farms,
+
+01:07:51 like what we're discussing.
+
+01:07:52 They have a GPU rack at the bottom of an apartment building.
+
+01:07:57 And any excess heat, which is one of the main cost drivers,
+
+01:08:01 that's actually used to preheat the water for the rest
+
+01:08:03 of the apartment.
+
+01:08:04 Yeah.
+
+01:08:05 So on paper, at least, you do have something that's not only
+
+01:08:09 carbon negative, but also more cost effective, because the waste product of
+
+01:08:13 the compute, the heat, is now a thing that someone is willing to pay for as well. So
+
+01:08:18 on paper, you can also be cheaper if you play your cards right.
+
+01:08:21 I ran across this and I was really inspired by it. I ended up not using LeafCloud for anything,
+
+01:08:27 but it's a cool idea.
+
+01:08:28 So the one downside of that approach is you do have to be close to a data center, because
+
+01:08:33 do you really want to store your private data on disk in an apartment building, or in a secure
+
+01:08:37 facility? So what they do is they just have glass fiber cables. Like the disk is in the facility,
+
+01:08:42 but the memory and the GPU and the compute is in the apartment building. That's sort of the way it works.
+
+01:08:45 It's like a mounted volume over fiber, sort of like a network volume over fiber. Sure. I think
+
+01:08:51 that's pretty common. A lot of data center setups, it just happens to be spanning an apartment
+
+01:08:56 complex, which is unusual. Oh, it's also only for certain types of compute loads, right? You're
+
+01:09:00 probably not going to run your web app from that apartment building.
It's probably not going to be
+
+01:09:04 productive. But if you're doing like big simulations, big deep learning, things like that... Actually,
+
+01:09:09 everyone, everyone knows you can do machine learning training, and it's all good for
+
+01:09:15 serverless. You just send the stuff out, it just goes, finds an apartment complex, runs it serverless, and
+
+01:09:20 it comes back, right? No, just kidding. So yeah, it's a pretty neat idea. Yeah, we'll see what
+
+01:09:25 the future brings. Like when I did data science, the joke I always made is, as a data scientist, I
+
+01:09:29 know for sure that the future is really hard to predict.
+
+01:09:35 Maybe the past is easier to predict, and that's what we do in machine learning.
+
+01:09:38 We try to predict the past, and yeah, we'll see what these LLMs do.
+
+01:09:41 But until then, again, also the repeating lesson from the course,
+
+01:09:46 there's some good habits, but in the end, it's not so much the LLM, it's more the stuff around it
+
+01:09:50 and how you think about it.
+
+01:09:51 And also, small shout-out to Marimo.
+
+01:09:53 I do think having some UI in the blend as well, such that you can say, oh, there's a text box here,
+
+01:09:58 another text box there,
+
+01:09:59 you can kind of play around making your own little LLM app. Like the thing that I typically do is I
+
+01:10:04 hook up Gemini to my YouTube videos to make like summaries for my blog. I like to try to find little
+
+01:10:08 excuses to improve your life by automating something with these LLM tools. And that's a nice
+
+01:10:13 way to get started. Yeah, I think we all probably have little opportunities to automate
+
+01:10:18 some low-stakes thing that we spend a lot of time or hassle on, and let an LLM
+
+01:10:23 have at it and write a little Python to make that happen, right?
+
+01:10:25 Yeah, and also expose yourself to your own slop as well.
+
+01:10:30 The nice thing about building your own apps is also because you then suddenly start to realize when it's also still good to be a human.
+
+01:10:35 Like, I'm not going to have an LLM automate messages to my wife and my kid.
+
+01:10:40 That's just not going to happen, because I've seen what these LLMs can output.
+
+01:10:43 Like, that's going to remain on the human side of the spectrum.
+
+01:10:46 Exactly.
+
+01:10:47 But making a summary for my blog on a video, I mean, that feels in the ballpark.
+
+01:10:51 100%.
+
+01:10:52 All right.
+
+01:10:52 Final word for people interested in this topic. They want to build,
+
+01:10:56 they want to plug in LLMs as part of that building block.
+
+01:10:59 What do you say?
+
+01:11:00 OpenRouter is nice and flexible.
+
+01:11:02 LLM by Simon Willison is great.
+
+01:11:04 Disk cache is a thing; you do want to disk cache.
+
+01:11:09 Maybe a final thing that I do maybe want to...
+
+01:11:12 It's also fine if you just want to learn Python as well.
+
+01:11:15 There's some of this talk of people saying, oh, we never have to...
+
+01:11:19 Oh, actually, people from Peru are tuning in.
+
+01:11:21 Peru is amazing.
+
+01:11:22 I'm seeing people sort of replying in the conversation.
+
+01:11:24 Yeah, absolutely.
+
+01:11:25 Oh, sweet.
+
+01:11:26 Best food of my life, Peru.
+
+01:11:28 Also, if you're looking for great excuses to learn Python,
+
+01:11:31 there's plenty.
+
+01:11:31 If you're looking for great excuses to go to Peru, food.
+
+01:11:35 But also, I think in the end, you are going to be better at using LLMs
+
+01:11:41 if you can still do things yourself.
+
+01:11:42 One thing I've started noticing is that it's a bit embarrassing, but like,
+
+01:11:45 oh, what's the CSS property of that thing?
+
+01:11:47 I used to know that.
+
+01:11:48 But nowadays, I just, oh, am I seriously going to ask ChatGPT this?
+
+01:11:51 This should be in my memory, for Pete's sake.
+
+01:11:53 Yes.
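That disk-cache tip from the wrap-up is worth making concrete: cache LLM responses on disk, keyed by the prompt, so repeated runs during iteration are free and instant. The `diskcache` library's `Cache.memoize()` does this for you; here's a stdlib-only sketch of the same idea, with a hypothetical `call_llm` standing in for the real (slow, paid) API call:

```python
import hashlib
import json
import tempfile
from pathlib import Path

# In practice this would be a fixed project directory, not a temp dir.
CACHE_DIR = Path(tempfile.mkdtemp())


def call_llm(prompt: str) -> str:
    """Hypothetical stand-in for a real LLM API call."""
    call_llm.hits += 1
    return f"response to {prompt}"


call_llm.hits = 0


def cached_llm(prompt: str) -> str:
    """Key the cache on a hash of the prompt; hit disk before the API."""
    key = hashlib.sha256(prompt.encode()).hexdigest()
    path = CACHE_DIR / f"{key}.json"
    if path.exists():                        # cache hit: no API call
        return json.loads(path.read_text())
    result = call_llm(prompt)                # cache miss: pay once...
    path.write_text(json.dumps(result))      # ...then store to disk
    return result


cached_llm("hello")
cached_llm("hello")   # second call is served from disk
print(call_llm.hits)  # the backend was only called once
```

This is also what makes the model-sweep and evaluation loops cheap: identical prompts never hit the API twice.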
+
+01:11:54 And sometimes it's like, I'm going to have a whole conversation with ChatGPT
+
+01:11:57 so I can change four characters in a style sheet.
+
+01:11:59 What is this nonsense?
+
+01:12:00 So part of me is also thinking like, definitely expose yourself to LLMs.
+
+01:12:04 And if that inspires you, that's great.
+
+01:12:06 But also try to not overly rely on it either.
+
+01:12:10 It's like I'm building flashcard apps for myself.
+
+01:12:12 So I'm still kind of in the loop, if that makes sense.
+
+01:12:15 So you've got to be a guardian of your own mind.
+
+01:12:19 And yes, so I can see it so easily.
+
+01:12:21 that you just become a machine that just says next okay yes continue and then in six months you're
+
+01:12:28 like gosh i can't program anything i just got to ask the chat and it's it's a problem yeah it's
+
+01:12:33 it's something well and also um and that kind of gets back to the point i made earlier um you want
+
+01:12:38 to be very mindful that if you if you have better tools you should also have better ideas and if
+
+01:12:43 that's not the case then something is wrong because that's then you get into the self-learned
+
+01:12:48 helplessness territory. And that's the one thing I do want to kind of guard against.
+
+01:12:52 Andrej Karpathy gave a great interview a little while ago where he mentioned the
+
+01:12:56 worst outcome of humanity. Did you see the movie WALL-E? Remember seeing that?
+
+01:13:00 Oh my gosh. That movie really made me sad.
+
+01:13:04 It's a good movie though. It's a fantastic movie, but as an adult,
+
+01:13:08 it's got a hard undertone to it. Oh my gosh.
+ +01:13:11 And especially now, because for people who never watched the movie, there's a scene in the movie where you see what humanity + +01:13:16 comes to and it's basically every everything has to be convenient so people are overweight + +01:13:20 they're in these like bikes that move them around they're always able to watch television drink a + +01:13:24 milkshake and eat a pizza like every convenience is sort of taken care of but you also just see + +01:13:29 that they're not happy and they can't even walk out like they can't they're like just they're so uh + +01:13:35 just dependent on the thing it's bad exactly so that's a sweet show it's a very sweet show i + +01:13:39 recommend it yeah it's it's a sweet movie but it's like if at some point you notice like hey + +01:13:44 learning feels hard, then maybe that's fine because maybe it's just like exercising. + +01:13:49 It's okay if that's a bit painful. + +01:13:51 In fact, if it's not painful, then maybe just participating in entertainment instead as well. + +01:13:54 So that's the thing. + +01:13:56 These are thoughts that I'm sort of battling with. + +01:13:57 I'm also sort of trying to be really critical of like, okay, I work at Marimo now. + +01:14:01 I've accessed all these LLMs and stuff. + +01:14:03 And can I build more awesome software that I wouldn't have before? + +01:14:06 One thing I made a while ago is I added gamepad support to Marimo Notebooks. + +01:14:10 So you can interact with Python with the gamepad and stuff. + +01:14:12 So this is just a experiment. + +01:14:14 But I would be really disappointed if I'm the only person making packages that I've never made before that are definitely trying to reach new heights. + +01:14:23 It does feel a bit lonely because it does feel like more people should do it. + +01:14:26 That's like the one sort of hint I want to give to people. 
+
+01:14:29 Try to inspire yourself a bit more and do more inspirational stuff so more cool things happen on my timeline.
+
+01:14:34 There's less AI slop.
+
+01:14:35 There's really cool software being built, people learning and all.
+
+01:14:38 Hear, hear.
+
+01:14:38 I appreciate that.
+
+01:14:39 All right.
+
+01:14:40 Well, Vincent, thanks for being back, sharing all these techniques.
+
+01:14:44 And yeah, I hope people go out there and build using AIs, not just with AIs.
+
+01:14:48 Definitely.
+
+01:14:49 Y'all have a good one.
+
+01:14:50 Yeah, see you later.
+
+01:14:50 Bye.
+
+01:14:51 This has been another episode of Talk Python To Me.
+
+01:14:54 Thank you to our sponsors.
+
+01:14:55 Be sure to check out what they're offering.
+
+01:14:57 It really helps support the show.
+
+01:14:59 Take some stress out of your life.
+
+01:15:00 Get notified immediately about errors and performance issues in your web
+
+01:15:04 or mobile applications with Sentry.
+
+01:15:06 Just visit talkpython.fm/sentry and get started for free.
+
+01:15:11 And be sure to use the promo code Talk Python, all one word.
+
+01:15:15 And it's brought to you by NordStellar.
+
+01:15:18 NordStellar is a threat exposure management platform from the Nord security family,
+
+01:15:22 the folks behind NordVPN that combines dark web intelligence,
+
+01:15:26 session hijacking prevention, brand and domain abuse detection,
+
+01:15:31 and external attack surface management.
+
+01:15:33 Learn more and get started keeping your team safe at talkpython.fm/nordstellar.
+
+01:15:38 If you or your team needs to learn Python, we have over 270 hours of beginner and advanced courses on topics ranging from complete beginners to async code, Flask, Django, HTML, and even LLMs.
+
+01:15:52 Best of all, there's not a subscription in sight. Browse the catalog at talkpython.fm.
+
+01:15:57 Be sure to subscribe to the show. Open your favorite podcast player app. Search for Python. We should be right at the top.
+
+01:16:03 If you enjoy the geeky rap theme song, you can download the full track.
+
+01:16:07 The link is in your podcast player show notes.
+
+01:16:09 This is your host, Michael Kennedy.
+
+01:16:11 Thank you so much for listening.
+
+01:16:12 I really appreciate it.
+
+01:16:13 Now get out there and write some Python code.
+
+01:16:25 Talk Python To Me, and we ready to roll Upgrading the code, no fear of getting old
+
+01:16:36 We tapped into that modern vibe, overcame each storm
+
+01:16:40 Talk Python To Me, async is the norm
+
diff --git a/transcripts/528-python-apps-with-llm-building-blocks.vtt b/transcripts/528-python-apps-with-llm-building-blocks.vtt
new file mode 100644
index 0000000..56894d3
--- /dev/null
+++ b/transcripts/528-python-apps-with-llm-building-blocks.vtt
@@ -0,0 +1,4865 @@
+WEBVTT
+
+00:00:00.020 --> 00:00:05.540
+In this episode, I'm talking with Vincent Warmerdam about treating LLMs as just another API in your
+
+00:00:05.650 --> 00:00:11.260
+Python app with clear boundaries, small focused endpoints, and good monitoring. We'll dig into
+
+00:00:11.410 --> 00:00:16.940
+patterns for wrapping these calls, caching and inspecting responses, and deciding where an LLM
+
+00:00:17.080 --> 00:00:22.900
+API actually earns its keep in your architecture. This is Talk Python To Me, episode 528,
+
+00:00:23.530 --> 00:00:25.660
+recorded October 23rd, 2025.
+
+00:00:44.100 --> 00:00:48.240
+Welcome to Talk Python To Me, the number one podcast for Python developers and data scientists.
+
+00:00:48.800 --> 00:00:53.860
+This is your host, Michael Kennedy. I'm a PSF fellow who's been coding for over 25 years.
+
+00:00:54.540 --> 00:00:55.620
+Let's connect on social media.
+
+00:00:55.960 --> 00:00:59.340
+You'll find me and Talk Python on Mastodon, Bluesky, and X.
+
+00:00:59.660 --> 00:01:01.440
+The social links are all in the show notes.
+
+00:01:02.020 --> 00:01:06.160
+You can find over 10 years of past episodes at talkpython.fm.
+ +00:01:06.580 --> 00:01:09.880 +And if you want to be part of the show, you can join our recording live streams. + +00:01:10.420 --> 00:01:10.780 +That's right. + +00:01:10.820 --> 00:01:14.520 +We live stream the raw, uncut version of each episode on YouTube. + +00:01:15.160 --> 00:01:19.620 +Just visit talkpython.fm/youtube to see the schedule of upcoming events. + +00:01:20.120 --> 00:01:24.460 +And be sure to subscribe and press the bell so you'll get notified anytime we're recording. + +00:01:24.960 --> 00:01:26.880 +This episode is brought to you by Sentry. + +00:01:27.400 --> 00:01:28.640 +Don't let those errors go unnoticed. + +00:01:28.820 --> 00:01:30.460 +Use Sentry like we do here at Talk Python. + +00:01:30.980 --> 00:01:33.880 +Sign up at talkpython.fm/sentry. + +00:01:34.400 --> 00:01:36.340 +And it's brought to you by NordStellar. + +00:01:36.920 --> 00:01:39.400 +NordStellar is a threat exposure management platform + +00:01:40.000 --> 00:01:41.220 +from the Nord security family, + +00:01:41.520 --> 00:01:42.660 +the folks behind NordVPN + +00:01:43.240 --> 00:01:44.820 +that combines dark web intelligence, + +00:01:45.440 --> 00:01:47.000 +session hijacking prevention, + +00:01:47.600 --> 00:01:49.680 +brand and domain abuse detection, + +00:01:50.020 --> 00:01:51.660 +and external attack surface management. + +00:01:52.240 --> 00:01:54.379 +Learn more and get started keeping your team safe + +00:01:54.400 --> 00:02:01.340 +at talkpython.fm/nordstellar. Hey, welcome to Talk Python To Me. Hi. Welcome back. Pleasure + +00:02:01.350 --> 00:02:06.200 +to be here. Yeah, it's been like, this is our second Talk Python interview, I think. We've + +00:02:06.300 --> 00:02:10.160 +interacted before also on socials and all that, but like, yeah, second time. Yay. Happy to be here. + +00:02:10.460 --> 00:02:15.759 +Yeah. Yeah. It's very exciting to have you back. 
I, you know, I'm trying to remember, I did a
+
+00:02:15.780 --> 00:02:18.060
+blog post on
+
+00:02:18.480 --> 00:02:20.100
+the most popular episodes
+
+00:02:20.780 --> 00:02:21.600
+of last year.
+
+00:02:22.140 --> 00:02:24.139
+I think, if I do recall correctly...
+
+00:02:25.240 --> 00:02:26.200
+Yeah, the
+
+00:02:26.260 --> 00:02:28.040
+number one last year was our
+
+00:02:28.220 --> 00:02:29.480
+interaction, I think, on spaCy.
+
+00:02:29.800 --> 00:02:31.260
+That's not terrible, is it?
+
+00:02:31.840 --> 00:02:33.420
+Yeah, it was a bragging right for a day.
+
+00:02:34.140 --> 00:02:36.120
+Yeah, it's pretty cool. So your episode
+
+00:02:36.800 --> 00:02:38.100
+and double bragging
+
+00:02:38.300 --> 00:02:39.900
+rights as your episode was released
+
+00:02:40.220 --> 00:02:41.680
+in, I don't know when that was,
+
+00:02:42.060 --> 00:02:44.239
+something like October or something, almost exactly
+
+00:02:44.260 --> 00:02:49.700
+a year ago to the day. And still it was the number one most downloaded for the whole year. So
+
+00:02:50.000 --> 00:02:56.680
+fantastic. And I think that might be an indication of where we're going to another very exciting
+
+00:02:56.980 --> 00:03:01.340
+topic, you know? Yeah. Like ML and AI only went up, right? So, so.
+
+00:03:03.100 --> 00:03:08.120
+You know what? I would love to get your thoughts on this. I, I'm seeing out on the internet and I
+
+00:03:08.200 --> 00:03:13.580
+want to put this up as kind of a, to counter this, this thought or whatever. But, but I'm seeing a
+
+00:03:13.580 --> 00:03:20.640
+lot of people get really frustrated that AI is a thing. You know, OpenAI just released their Atlas
+
+00:03:21.180 --> 00:03:26.520
+browser, which I'm not super excited about, but it's like their version of what an AI browser
+
+00:03:26.820 --> 00:03:32.640
+wrapping Chromium looks like, you know what I mean? And it's fine.
But the comments in Ars Technica,
+
+00:03:32.840 --> 00:03:37.060
+which are usually pretty balanced, are just like people losing it over the fact that it has AI.
+
+00:03:37.640 --> 00:03:42.060
+Are people tired of this stuff? Or why do you think that's the reaction, I guess, is really what I
+
+00:03:42.020 --> 00:03:47.780
+want to ask you i mean i guess i have like a couple of perspectives but like i do like i think i could
+
+00:03:47.900 --> 00:03:52.660
+say like we are all tired that like every tech product out there is trying its best to find a
+
+00:03:52.760 --> 00:03:58.400
+way to put ai in the product like oh any any gmail any calendar app let's put ai in there and it
+
+00:03:58.500 --> 00:04:02.580
+gets a bit tiring like i wouldn't be too surprised if like fire hoses and refrigerators also come with
+
+00:04:02.700 --> 00:04:09.439
+ai features these days like it feels a bit nuts i know it's like what what is my refrigerator ai for
+
+00:04:09.460 --> 00:04:39.260
+Yeah, or like soccer ball of whatever object, like it feels like people are overdoing the AI thing. So okay, that can be like legitimate source of frustration. But then it also becomes a little bit personal, because I can also imagine if you put like, your heart and soul into like being a good engineer, and like you want to take the internet serious, like you consider it something is somewhat sacred, because it helped you so much in your career, then I definitely can also imagine like, Oh, my God, please don't turn that into a slop. Like I built a career on that. So that might be the source of all these things. I guess like my main way of dealing with it,
+
+00:04:39.460 --> 00:04:43.500
+is more along the lines of like, okay, like, this genie is not going to go back into the
+
+00:04:43.670 --> 00:04:47.280
+bottle in a way.
But if we're able to do more things with these LLMs, if they're able to + +00:04:47.540 --> 00:04:52.420 +automate more boilerplate and that sort of thing, then it seems that then I can try and + +00:04:52.500 --> 00:04:56.180 +get better at natural intelligence as well, instead of this artificial intelligence. So + +00:04:56.280 --> 00:05:01.000 +like, okay, I can vibe code all these things now. So the bottleneck is just do I have good + +00:05:01.220 --> 00:05:06.020 +ideas? And maybe I should invest in good ideas. And oh, maybe I can actually learn about + +00:05:06.260 --> 00:05:09.420 +JavaScript, but then I should not focus on the syntax, but it shouldn't focus maybe on + +00:05:09.440 --> 00:05:14.860 +the browser and like how does CSS work and oh I should actually maybe do a little bit of stuff + +00:05:15.000 --> 00:05:19.680 +with flashcards just so I know the syntax just enough so I can review okay like I try to be very + +00:05:19.980 --> 00:05:23.940 +conscious about it that way that's kind of my approach more and I it's easy to get really lazy + +00:05:24.140 --> 00:05:28.480 +right and just push the button and say do the next thing and not use it as an opportunity like oh it + +00:05:28.510 --> 00:05:33.640 +did something I didn't know let me have a conversation and try to understand the uh I + +00:05:33.640 --> 00:05:37.320 +didn't know I think the key word here is you want to be deliberate about it like I think if you can + +00:05:37.280 --> 00:05:41.780 +sort of say like okay i'm deliberate about the way i use this stuff and i'm also learning as i'm doing + +00:05:41.980 --> 00:05:46.800 +this like one thing i really like to do is i just give give like some sort of bonkers front-end task + +00:05:46.940 --> 00:05:50.840 +to the llm that i have no idea how i would solve it and then study the output like that's the thing + +00:05:50.900 --> 00:05:56.480 +i actually do once in a while um but yeah i mean i do get it that like people have mixed 
emotions + +00:05:56.680 --> 00:06:01.140 +about it and that's totally cool and fine it's just that for me in my situation this is how i + +00:06:01.480 --> 00:06:07.240 +deal with it yeah i think that's fair i think also i i definitely have long since been tired of like + +00:06:07.260 --> 00:06:09.960 +every email app having to rewrite this with AI + +00:06:10.620 --> 00:06:12.160 +and it usually destroys the email. + +00:06:12.840 --> 00:06:14.180 +You know, like removes all the formatting. + +00:06:14.710 --> 00:06:15.420 +And then you just, you know, + +00:06:15.520 --> 00:06:16.980 +there's always the cartoon of like, + +00:06:17.300 --> 00:06:19.060 +I wrote an email as bullet points. + +00:06:19.150 --> 00:06:20.660 +I asked AI to expand it. + +00:06:20.690 --> 00:06:21.460 +I sent it to you. + +00:06:21.750 --> 00:06:22.700 +You're like, what a long email. + +00:06:23.100 --> 00:06:24.900 +I'm gonna ask AI to summarize this. + +00:06:25.260 --> 00:06:26.480 +We should have just sent the bullet points. + +00:06:27.620 --> 00:06:30.700 +- Yeah, I mean, I do fear the amount of slop though. + +00:06:30.860 --> 00:06:33.460 +Like I do, it almost feels like every good tech YouTuber, + +00:06:33.860 --> 00:06:35.560 +there aren't a lot of good tech YouTubers anymore + +00:06:35.580 --> 00:06:39.960 +they're all doing AI stuff instead of doing like actual really great programming. There's still a + +00:06:40.020 --> 00:06:45.300 +few. Anthony writes code is like a really good one still. Like he's awesome. Definitely follow + +00:06:45.620 --> 00:06:51.180 +him if you want to learn Python. He does cool stuff. But yeah, the main thing, I guess from my + +00:06:51.300 --> 00:06:55.520 +like selfish perspective, YouTube used to be better because nowadays it's all about the AI and you've + +00:06:55.580 --> 00:06:59.360 +always got this thumbnail of a guy pulling his hair out like, oh my God, this changes everything. + +00:06:59.840 --> 00:07:02.120 +Yeah. 
I would love to see a video that's maybe not that.
+
+00:07:03.060 --> 00:07:05.060
+Have you heard that there's an AI bubble that's going to burst?
+
+00:07:05.240 --> 00:07:12.820
+um yeah well we'll see exactly how far i think so actually i mean we can speculate like i i can
+
+00:07:12.950 --> 00:07:16.600
+argue that it's been over invested i can also argue there's still so much potential the one
+
+00:07:16.600 --> 00:07:21.280
+thing i will say though um we have Claude kinds of tools and like we're all looking forward to like
+
+00:07:21.280 --> 00:07:24.920
+the vibe coding oh it's never been easier and that sort of thing but i would have expected an explosion
+
+00:07:24.930 --> 00:07:29.540
+of awesome apps to be going along with it and it doesn't necessarily feel like we have much better
+
+00:07:29.700 --> 00:07:34.680
+software even though we supposedly have way better tools so something about that is making me a little
+
+00:07:34.700 --> 00:07:38.020
+a bit suspicious, but it might also just be that the apps that are being built that I'm not the
+
+00:07:38.170 --> 00:07:42.320
+target audience for, because I can imagine that. Because let's say you have something awesome for
+
+00:07:42.500 --> 00:07:46.620
+dentists, they can still be an awesome crud app that you could build with Claude, but I'm not a
+
+00:07:46.720 --> 00:07:52.820
+dentist. So, you know, wow, there's a lot of new cool dentistry management apps. You're right.
+
+00:07:53.280 --> 00:07:58.600
+If a dentist can outprogram, right? Like I do believe in the story that, oh, as a dentist,
+
+00:07:58.810 --> 00:08:02.840
+you might be best equipped to write an app for dentists, right? And if they're now more empowered
+
+00:08:02.860 --> 00:08:04.580
+to maybe do some stuff with code.
+
+00:08:04.670 --> 00:08:06.080
+I mean, there's a story there
+
+00:08:06.200 --> 00:08:07.480
+that every single niche profession
+
+00:08:07.840 --> 00:08:10.160
+will have someone who can do enough Claude
+
+00:08:10.190 --> 00:08:11.480
+to make the app for that profession.
+
+00:08:12.420 --> 00:08:12.520
+Yeah.
+
+00:08:13.240 --> 00:08:13.940
+Well, time will tell.
+
+00:08:13.940 --> 00:08:16.460
+It used to be that you have to have a programming skill
+
+00:08:16.660 --> 00:08:18.900
+and a little bit of, let's say, dentistry experience
+
+00:08:19.440 --> 00:08:21.660
+to build the right sort of specialized vertical.
+
+00:08:21.960 --> 00:08:23.160
+And now I think it maybe is reversed.
+
+00:08:23.560 --> 00:08:24.800
+You need a lot of dentistry
+
+00:08:25.110 --> 00:08:27.800
+and a little bit of coding skill to go along with an AI.
+
+00:08:28.260 --> 00:08:28.380
+Yeah.
+
+00:08:29.440 --> 00:08:32.039
+The character sheet used to be 80 points here, 20 points there.
+
+00:08:32.140 --> 00:08:34.260
+and now it's 80 points there and 20 points here.
+
+00:08:34.520 --> 00:08:34.800
+Yes, exactly.
+
+00:08:35.219 --> 00:08:36.260
+That's exactly what I was thinking.
+
+00:08:36.300 --> 00:08:37.820
+Like, I think maybe that's actually shifted
+
+00:08:38.200 --> 00:08:39.039
+until you said that.
+
+00:08:39.039 --> 00:08:39.960
+I've never really thought about it,
+
+00:08:40.039 --> 00:08:41.000
+but it just may be.
+
+00:08:41.539 --> 00:08:44.260
+I do want to point out for people listening,
+
+00:08:44.420 --> 00:08:47.020
+we're not going to really talk too much
+
+00:08:47.180 --> 00:08:51.280
+about like using agentic tools to write code for us.
+
+00:08:51.400 --> 00:08:52.580
+Like we have been speculating
+
+00:08:52.720 --> 00:08:53.820
+about this theoretical dentist.
+ +00:08:55.100 --> 00:08:56.460 +But what instead we're going to do + +00:08:56.680 --> 00:08:58.660 +is we're going to talk to you, Vincent, + +00:08:59.020 --> 00:09:01.560 +about how can I use an LLM + +00:09:01.700 --> 00:09:08.020 +like a library or an API to add functionality, features, behaviors to an existing data science + +00:09:08.300 --> 00:09:12.340 +tool or to an existing web app or whatever it is we're building, right? + +00:09:12.640 --> 00:09:12.880 +Yes. + +00:09:13.360 --> 00:09:18.480 +So also people might probably know it, but I made a course for Talk Python, of course, + +00:09:18.480 --> 00:09:20.480 +and we're going to talk about some of those details. + +00:09:21.000 --> 00:09:25.820 +But the idea for that course was not necessarily like, how can I use LLMs to build me software? + +00:09:26.100 --> 00:09:29.260 +It's more, oh, how can I build software that also uses LLMs under the hood? + +00:09:29.340 --> 00:09:33.240 +Like if I have an app that makes a summary, how can I make sure the summary is reliable? + +00:09:33.540 --> 00:09:36.680 +And basically get the ball rolling on those building blocks. + +00:09:36.860 --> 00:09:37.720 +That's what that course is about. + +00:09:37.860 --> 00:09:41.200 +That's also probably going to be the main bones of this topic as opposed to a dentist + +00:09:41.460 --> 00:09:43.480 +Bob and his ambitions to make a new tech startup. + +00:09:44.340 --> 00:09:44.700 +Exactly. + +00:09:45.040 --> 00:09:47.280 +We're going to grow new teeth if we just could get the software right. + +00:09:47.780 --> 00:09:48.920 +Now, yeah. + +00:09:48.990 --> 00:09:49.860 +So how do you do that? + +00:09:49.890 --> 00:09:51.040 +And we're going to talk about that before. + +00:09:51.390 --> 00:09:56.240 +I just want to give you a chance before we really dive too far into the details of how + +00:09:56.260 --> 00:10:00.400 +we make that work is just, you know, give people a sense of what you've been up to lately. 
+
+00:10:00.540 --> 00:10:02.940
+You've done a lot of cool stuff with CalmCode.io.
+
+00:10:04.120 --> 00:10:07.540
+I can see your YouTube game is growing strong here.
+
+00:10:08.559 --> 00:10:10.860
+And you've been doing a lot of data science at Marimo.
+
+00:10:11.340 --> 00:10:12.180
+Yeah, what's...
+
+00:10:12.420 --> 00:10:12.580
+Right.
+
+00:10:12.720 --> 00:10:15.900
+Yeah, so we haven't spoken in a year, so maybe we should catch up.
+
+00:10:16.500 --> 00:10:19.480
+So long story short, CalmCode is still a project that I maintain.
+
+00:10:19.800 --> 00:10:24.500
+It's basically 99% free educational content on Python.
+
+00:10:24.780 --> 00:10:26.520
+that thing is just going to maintain itself.
+
+00:10:27.540 --> 00:10:28.760
+Super happy to keep maintaining it.
+
+00:10:29.190 --> 00:10:30.500
+The main thing I'm doing with that nowadays
+
+00:10:30.670 --> 00:10:32.500
+is just every two months I switch cloud providers
+
+00:10:32.710 --> 00:10:33.560
+just because I can,
+
+00:10:33.610 --> 00:10:35.740
+and then I can sort of see what it's like on the other side.
+
+00:10:36.380 --> 00:10:37.100
+So that's the thing that I do.
+
+00:10:37.680 --> 00:10:38.480
+From CalmCode, though,
+
+00:10:38.650 --> 00:10:40.080
+I figured I might also start a YouTube channel,
+
+00:10:40.210 --> 00:10:41.600
+and that ended up being a YouTube channel
+
+00:10:41.630 --> 00:10:42.980
+where I still talk about Python stuff,
+
+00:10:43.090 --> 00:10:45.760
+but I mainly talk about these fancy ergonomic keyboards
+
+00:10:46.010 --> 00:10:47.240
+because I had a couple of RSI issues.
+
+00:10:47.960 --> 00:10:48.740
+In a year's time,
+
+00:10:48.890 --> 00:10:51.080
+I went from 100 subscribers to 5,000 something,
+
+00:10:51.310 --> 00:10:52.760
+so this thing kind of took off.
+
+00:10:53.160 --> 00:10:53.460
+That's awesome.
+
+00:10:53.560 --> 00:10:55.680
+I actually have sponsors now.
+
+00:10:55.750 --> 00:10:59.980
+So I actually have a couple of companies in Asia sending me their custom boards for me to review.
+
+00:11:00.110 --> 00:11:01.640
+So that's been a really fun journey.
+
+00:11:02.090 --> 00:11:04.140
+But since last time we spoke, I also switched employers.
+
+00:11:04.400 --> 00:11:07.420
+So people might have heard of Jupyter before.
+
+00:11:07.720 --> 00:11:10.400
+That's an environment in Python where you can do interactive things.
+
+00:11:10.900 --> 00:11:15.320
+And I work now for a company that makes Marimo, which is an alternative, a very likable one.
+
+00:11:15.390 --> 00:11:16.820
+It does work completely differently.
+
+00:11:17.520 --> 00:11:23.260
+One of the main things that attracts people to it is the fact that these kinds of notebooks are Python files under the hood.
+
+00:11:23.520 --> 00:11:25.000
+They're also a little bit more interactive.
+
+00:11:25.180 --> 00:11:27.420
+You can do more rapid prototyping with UI.
+
+00:11:27.680 --> 00:11:29.240
+You can blend Python with it very nicely.
+
+00:11:30.260 --> 00:11:34.120
+So I will say like all of my rapid prototyping, especially with LLMs, I do that in Marimo
+
+00:11:34.360 --> 00:11:37.760
+nowadays just so you can blend Python with UIs very easily.
+
+00:11:38.340 --> 00:11:40.460
+And there's demos on the site that people can go ahead and check out.
+
+00:11:41.299 --> 00:11:44.800
+But that's also the second YouTube channel I made this year.
+
+00:11:45.000 --> 00:11:46.580
+I do a lot of stuff for Marimo.
+
+00:11:47.060 --> 00:11:51.720
+I'm a little bit more on the growth side of things than the hardcore engineering side of things.
+
+00:11:51.740 --> 00:11:55.700
+I still make a lot of plugins for Marimo, of course, but that's a little bit more of what I do nowadays.
+
+00:11:56.700 --> 00:11:58.800
+Making sure that people learn about Marimo.
+ +00:12:00.120 --> 00:12:02.360 +You can do things in a notebook now that you couldn't think of before. + +00:12:03.180 --> 00:12:08.800 +I write my command line apps in a notebook nowadays because it's actually not just because I can, but because it's convenient too. + +00:12:09.040 --> 00:12:14.440 +So we'll talk a little bit about that later when we talk about LLMs, but that's effectively the short story of what I've been doing last year. + +00:12:14.660 --> 00:12:17.720 +Nice. All right. I am a fan of Marimo. + +00:12:18.860 --> 00:12:19.480 +Happy to hear it. + +00:12:19.620 --> 00:12:47.340 +Yeah, yeah, yeah. I did a data science course this year, just enough Python and software engineering for data scientists. Like, you know, what could you bring from the software engineering side to like sort of make your, your data engineering, data science stuff a little more reliable. But I was right on the fence of should I, should I use this? Because it's, I think it's clearly better. But at the same time, I also want to use what people are using. So I didn't quite go for it. But I just, the UI is fantastic. + +00:12:47.620 --> 00:12:50.100 +the reproducibility, reliability of it, + +00:12:50.380 --> 00:12:52.320 +where it uses the abstract syntax tree + +00:12:52.500 --> 00:12:53.480 +or concrete, whatever, + +00:12:54.000 --> 00:12:55.940 +to understand relationships across cells + +00:12:56.120 --> 00:12:57.440 +so they don't get out of sync, potentially. + +00:12:57.860 --> 00:12:59.380 +Yeah, I think it's really nice. + +00:12:59.500 --> 00:13:00.560 +I had Akshay on the show, + +00:13:01.360 --> 00:13:02.220 +who you work with also, + +00:13:02.560 --> 00:13:03.220 +to talk about it. + +00:13:03.780 --> 00:13:06.660 +The one thing in Jupyter that's, + +00:13:07.300 --> 00:13:08.720 +I mean, let me start by saying, + +00:13:08.980 --> 00:13:10.360 +Jupyter is still a really cool project. 
+ +00:13:10.520 --> 00:13:12.000 +Like if I think back of the last decade, + +00:13:12.380 --> 00:13:13.820 +like all the good that that project + +00:13:13.880 --> 00:13:14.820 +has brought to my career, + +00:13:14.960 --> 00:13:16.160 +not just directly as a user, + +00:13:16.280 --> 00:13:17.020 +but also indirectly, + +00:13:17.700 --> 00:13:19.160 +all the algorithms that got invented + +00:13:19.350 --> 00:13:20.940 +simply because we had a good enough + +00:13:21.640 --> 00:13:23.240 +interactive environment suddenly, which we never + +00:13:23.400 --> 00:13:24.940 +had before, it's done wonders. + +00:13:25.120 --> 00:13:26.840 +So we should definitely be thankful. + +00:13:27.230 --> 00:13:28.600 +Yeah, I'm not bashing on it either. + +00:13:29.010 --> 00:13:31.200 +Yeah, but I do always want to + +00:13:31.280 --> 00:13:33.120 +make sure that I give credit where credit is due, because + +00:13:33.620 --> 00:13:34.940 +the project had a lot of impact. + +00:13:35.240 --> 00:13:37.000 +But there is this really annoying thing with Jupyter, though. + +00:13:37.220 --> 00:13:39.200 +Besides, you can be a bit grumpy + +00:13:39.270 --> 00:13:41.220 +about the file format, sure, but the one thing that's + +00:13:41.270 --> 00:13:42.960 +very annoying is you can write down + +00:13:43.170 --> 00:13:45.160 +X is equal to 10, and then + +00:13:45.160 --> 00:13:47.040 +have all sorts of cells below it that depend on X, + +00:13:47.420 --> 00:13:50.560 +You could then delete the cell and the notebook will not complain to you about it. + +00:13:51.700 --> 00:13:58.320 +Even though if anyone else tries to rerun the notebook, X is gone, it won't run, and your notebook is broken and you can't share the thing anymore. + +00:13:58.710 --> 00:14:02.680 +And nowadays, fast forward like many years later, and we've got stuff like uv now. + +00:14:02.830 --> 00:14:05.720 +So we can add metadata to a Python file to add dependencies. 
+ +00:14:06.160 --> 00:14:11.100 +And oh, wait, because Marimo is a Python file, we can add dependencies to the notebook that makes it super reproducible. + +00:14:11.480 --> 00:14:17.820 +There's just all this stuff that you can rethink now simply because we have a good notebook format that is still a Python file. + +00:14:18.100 --> 00:14:19.680 +That's really the killer feature here. + +00:14:20.759 --> 00:14:24.720 +We can talk more about it if you like, but I can talk for ages about this topic, just warning you. + +00:14:25.600 --> 00:14:29.140 +One final thing, maybe also for the software engineering side of things, because you didn't mention it. + +00:14:29.560 --> 00:14:31.340 +Just to give an example of something we added recently. + +00:14:31.920 --> 00:14:38.860 +If you have a cell in Marimo and there's a function in there that starts with test underscore, we will automatically assume it's a pytest. + +00:14:39.240 --> 00:14:41.820 +So you can actually add unit tests to your notebook as well. + +00:14:41.940 --> 00:14:44.360 +And then if you say python notebook.py, + +00:14:44.800 --> 00:14:46.940 +then pytest will just run all the tests for you, + +00:14:47.160 --> 00:14:49.840 +even though you can also run the tests in line in your CI CD as well. + +00:14:49.920 --> 00:14:52.720 +So there's all sorts of these Python specific things that we can add, + +00:14:53.100 --> 00:14:54.340 +again, because it's just a Python file. + +00:14:54.620 --> 00:14:57.940 +Yeah, it's sort of the second mover advantage or nth mover advantage + +00:14:58.260 --> 00:15:00.680 +where n is greater than one, where you see, okay, that was awesome. + +00:15:01.000 --> 00:15:02.880 +Maybe this was a little bit of a rough edge. + +00:15:02.960 --> 00:15:05.840 +And what would we do to work around that or smooth it out, right? + +00:15:06.140 --> 00:15:09.600 +Well, and also we're also lucky that we're born in the age of uv. 
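The two features just described — dependency metadata embedded in the Python file (the inline script metadata block from PEP 723 that uv reads, which Marimo's sandbox mode builds on) and `test_`-prefixed functions that pytest collects — can be sketched on a plain script, precisely because a Marimo notebook is just a Python file. The dependency list and function below are illustrative, not taken from a real notebook:

```python
# /// script
# requires-python = ">=3.10"
# dependencies = []
# ///
# The comment block above is PEP 723 inline script metadata: uv (and
# Marimo's sandbox mode) can read it to build an environment for this
# single file. Everything below is ordinary Python.

def word_count(text: str) -> int:
    """A tiny piece of 'notebook' logic."""
    return len(text.split())

def test_word_count():
    # pytest collects any function whose name starts with test_, so
    # running pytest against this file picks this up; Marimo applies
    # the same convention to functions defined inside notebook cells.
    assert word_count("hello there world") == 3

if __name__ == "__main__":
    test_word_count()
    print("all assertions passed")
```

Adding real packages to the `dependencies` list lets a tool like `uv run` build the environment on the fly, which is the reproducibility win being described here.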
+ +00:15:09.880 --> 00:15:14.140 +I got to say like a lot of like quality of life improvements to a notebook. + +00:15:15.180 --> 00:15:17.100 +A lot of that is also due to the fact that uv is around. + +00:15:17.220 --> 00:15:18.600 +That's definitely a huge help as well. + +00:15:20.640 --> 00:15:23.900 +This portion of Talk Python To Me is brought to you by Sentry's Seer. + +00:15:24.600 --> 00:15:27.380 +I'm excited to share a new tool from Sentry, Seer. + +00:15:27.960 --> 00:15:30.740 +Seer is your AI driven pair programmer that finds, + +00:15:31.020 --> 00:15:35.240 +diagnoses and fixes code issues in your Python app faster than ever. + +00:15:35.480 --> 00:15:39.420 +If you're already using Sentry, you are already using Sentry, right? + +00:15:39.940 --> 00:15:44.220 +Then using Seer is as simple as enabling a feature on your already existing project. + +00:15:45.020 --> 00:15:48.360 +Seer taps into all the rich context Sentry has about an error. + +00:15:48.860 --> 00:15:52.920 +Stack traces, logs, commit history, performance data, essentially everything. + +00:15:53.560 --> 00:15:57.640 +Then it employs its agentic AI code capabilities to figure out what is wrong. + +00:15:58.240 --> 00:16:02.040 +It's like having a senior developer pair programming with you on bug fixes. + +00:16:02.920 --> 00:16:08.180 +SEER then proposes a solution, generating a patch for your code and even opening a GitHub pull request. + +00:16:08.630 --> 00:16:13.320 +This leaves the developers in charge because it's up to them to actually approve the PR. + +00:16:13.740 --> 00:16:17.540 +But it can reduce the time from error detection to fix dramatically. + +00:16:18.260 --> 00:16:23.800 +Developers who've tried it found it can fix errors in one shot that would have taken them hours to debug. + +00:16:24.500 --> 00:16:28.900 +SEER boasts a 94.5% accuracy in identifying root causes. 
+

+00:16:29.580 --> 00:16:36.260

+SEER also prioritizes actionable issues with an actionability score, so you know what to fix first.

+00:16:36.760 --> 00:16:43.780

+This transforms Sentry errors into actionable fixes, turning a pile of error reports into an ordered to-do list.

+00:16:44.420 --> 00:16:53.500

+If you could use an always-on-call AI agent to help track down errors and propose fixes before you even have time to read the notification, check out Sentry's SEER.

+00:16:54.140 --> 00:16:58.120

+Just visit talkpython.fm/SEER, S-E-E-R.

+00:16:58.740 --> 00:17:00.620

+The link is in your podcast player show notes.

+00:17:01.220 --> 00:17:03.320

+Be sure to use our code, Talk Python.

+00:17:03.820 --> 00:17:04.699

+One word, all caps.

+00:17:05.380 --> 00:17:07.420

+Thank you, Sentry, for supporting Talk Python To Me.

+00:17:08.720 --> 00:17:10.500

+I saw on, gosh, where was it?

+00:17:10.699 --> 00:17:13.760

+Speaking of Reddit, I saw on slash R,

+00:17:14.270 --> 00:17:15.760

+learn Python or slash R Python.

+00:17:16.060 --> 00:17:19.020

+One of the slash R's with a Python substring.

+00:17:19.280 --> 00:17:23.060

+Someone asks, what Python package manager would you use now?

+00:17:23.140 --> 00:17:26.459

+And it's just like, how many times can you say uv?

+00:17:27.819 --> 00:17:29.200

+The feedback comments.

+00:17:29.400 --> 00:17:31.460

+I mean, it was someone new who was unsure, right?

+00:17:31.860 --> 00:17:32.900

+They've seen all the different ones.

+00:17:33.460 --> 00:17:34.100

+And yeah.

+00:17:34.660 --> 00:17:38.160

+I mean, the coolest comparison I've seen, I think it was a tweet someone posted.

+00:17:38.300 --> 00:17:41.360

+But like, definitely, like, suppose you're in the data field right now.

+00:17:41.500 --> 00:17:42.780

+Like, what are the tools 10 years ago?

+00:17:42.980 --> 00:17:43.680

+What are the tools now?
+

+00:17:43.960 --> 00:17:46.800

+It definitely does feel like, okay, before we had pip, now we have uv.

+00:17:47.300 --> 00:17:49.400

+Before we had Pandas, now we have Polars.

+00:17:50.360 --> 00:17:52.180

+Before we had Matplotlib, now we have Altair.

+00:17:52.240 --> 00:17:53.700

+And before we had Jupyter, now we've got Marimo.

+00:17:53.780 --> 00:17:57.000

+You can kind of see a generational shift, not just on the notebook side of things,

+00:17:57.100 --> 00:17:59.500

+but like on the package manager, on the data frame library,

+00:18:00.280 --> 00:18:01.980

+we're all learning from the previous generation

+00:18:02.200 --> 00:18:03.180

+and it kind of feels like.

+00:18:03.300 --> 00:18:04.000

+Yeah, absolutely.

+00:18:04.380 --> 00:18:05.820

+It's an amazing time.

+00:18:06.400 --> 00:18:08.620

+Every single day in Python is just more exciting

+00:18:08.700 --> 00:18:09.280

+than the day before.

+00:18:09.620 --> 00:18:09.780

+Yeah.

+00:18:10.320 --> 00:18:12.460

+Although I should also mention, especially with the Polars

+00:18:12.500 --> 00:18:13.740

+thing, there's a fair bit of Rust too.

+00:18:14.420 --> 00:18:16.020

+That also makes a difference.

+00:18:16.440 --> 00:18:17.560

+Yeah, yeah, of course.

+00:18:17.760 --> 00:18:22.280

+So before Flyby, I mean, you talked a bit about your YouTube

+00:18:22.880 --> 00:18:25.100

+channel, you review the ergonomic keyboards

+00:18:25.420 --> 00:18:29.680

+you have. For those people who are just listening, not watching, your background is super nice.

+00:18:30.320 --> 00:18:35.280

+Your bona fides are solid here, but you've got a bunch of different ones. And I personally also,

+00:18:35.700 --> 00:18:38.400

+I don't know, you might even just kick me out of the club if I should do that.

+00:18:38.480 --> 00:18:39.880

+That's the Microsoft Sculpt, right?

+00:18:40.600 --> 00:18:44.800

+Microsoft Sculpt ergonomic. 
And I like it so much because I can take this little bottom thing off

+00:18:44.800 --> 00:18:49.140

+and it's like razor thin if I could rotate and reverse and jam it in my backpack and take it

+00:18:49.240 --> 00:18:55.380

+with me. But I've also had RSI issues for 25 years or something. And it was really bad. And I switched

+00:18:55.400 --> 00:18:56.760

+I used to type on like laptops and stuff.

+00:18:56.840 --> 00:19:01.080

+And I switched to just something like this and I could type 10 hours a day and

+00:19:01.120 --> 00:19:02.360

+there's no problem. Whereas before, if I,

+00:19:02.400 --> 00:19:05.220

+if I were forced to type full on, on a laptop,

+00:19:05.460 --> 00:19:08.980

+I would probably be like unable to work in a week, if not less.

+00:19:09.420 --> 00:19:12.980

+So we had an offsite in San Francisco. So I was there like two weeks ago,

+00:19:13.460 --> 00:19:16.080

+a lot of fun, but like, I'm not going to bring like,

+00:19:16.860 --> 00:19:18.680

+airport security is going to look at this and wonder what,

+00:19:19.560 --> 00:19:22.640

+like what kind of weird alien device this is. Right. So I figured, okay,

+00:19:22.780 --> 00:19:25.180

+I'll leave those at home and just bring my normal laptop. And after a week,

+00:19:25.220 --> 00:19:26.660

+I'm feeling it in my wrist again.

+00:19:27.020 --> 00:19:27.520

+It's not good.

+00:19:27.940 --> 00:19:29.240

+Yeah, it's something to take seriously.

+00:19:29.360 --> 00:19:31.400

+Although I will also say like

+00:19:32.600 --> 00:19:34.400

+because the kid was sick and the wife was sick

+00:19:34.460 --> 00:19:36.200

+so I wasn't able to do a whole lot of exercise at home.

+00:19:37.040 --> 00:19:37.620

+Also do exercise.

+00:19:37.880 --> 00:19:39.220

+That also totally helps.

+00:19:39.300 --> 00:19:39.900

+If I had to like,

+00:19:40.920 --> 00:19:42.660

+sure, buy an ergonomic keyboard if you like. 
+ +00:19:42.740 --> 00:19:43.760 +Programmatic keyboards, they're great. + +00:19:44.120 --> 00:19:45.240 +But if you're going to do that as an excuse + +00:19:45.380 --> 00:19:46.800 +not to do exercise, you're doing it wrong. + +00:19:47.120 --> 00:19:48.860 +That's the one thing I do want to add to that. + +00:19:48.960 --> 00:19:50.220 +There's lots of stretches you can do. + +00:19:50.460 --> 00:19:51.160 +Taking breaks matters. + +00:19:51.660 --> 00:19:52.300 +Exercise matters. + +00:19:53.020 --> 00:19:56.040 +But I think those keyboards are an essential feature. + +00:19:56.520 --> 00:19:58.560 +They do totally help. + +00:19:58.600 --> 00:20:01.140 +And also, sometimes it's not so much a healing measure. + +00:20:01.340 --> 00:20:02.400 +It's more of a preventative measure. + +00:20:02.760 --> 00:20:03.740 +Yes, 100%. + +00:20:04.040 --> 00:20:09.520 +And I would like to put this message out just to everyone listening who is young, absolutely indestructible. + +00:20:11.360 --> 00:20:11.840 +Please. + +00:20:12.420 --> 00:20:15.520 +I know they're a bit of a pain in the butt with the weird curviness. + +00:20:15.900 --> 00:20:20.540 +It is so worth it to just say, I've never had any RSI issues. + +00:20:21.060 --> 00:20:22.980 +I just use this weird keyboard and people make fun of me, + +00:20:23.010 --> 00:20:23.560 +but I don't care. + +00:20:23.740 --> 00:20:26.840 +You know, that is a better thing than like, I'm working on it. + +00:20:27.220 --> 00:20:27.420 +Yeah. + +00:20:27.660 --> 00:20:28.140 +Oh, my. + +00:20:28.140 --> 00:20:28.420 +It hurts. + +00:20:28.610 --> 00:20:28.880 +You know? + +00:20:29.400 --> 00:20:31.600 +Well, another thing is also, especially now + +00:20:31.640 --> 00:20:32.840 +we've got stuff like Claude, right? + +00:20:32.840 --> 00:20:35.240 +I just want-- if I can point out like one thing on that front. 
+

+00:20:36.180 --> 00:20:37.880

+So these ergonomic keyboards, you can often

+00:20:38.190 --> 00:20:39.460

+program them as you see fit.

+00:20:39.620 --> 00:20:41.880

+So you can say, like, if I hit this key, it's K.

+00:20:42.010 --> 00:20:43.740

+But if I hold it down, it becomes command,

+00:20:43.960 --> 00:20:44.920

+or whatever you like.

+00:20:45.180 --> 00:20:47.000

+But it also means that you can map macros or shortcuts.

+00:20:47.280 --> 00:20:51.020

+So whenever I hit this button, an app called Mac Whisper boots

+00:20:51.040 --> 00:20:51.720

+That's going to record my voice.

+00:20:51.720 --> 00:20:52.680

+I love Mac Whisper.

+00:20:53.140 --> 00:20:53.240

+Yeah.

+00:20:53.620 --> 00:20:55.360

+And then there's alternative variants for it.

+00:20:55.380 --> 00:20:56.800

+I'm sure you've got stuff for Linux as well.

+00:20:56.880 --> 00:20:59.280

+But the main thing it does is just really good speech to text.

+00:20:59.740 --> 00:21:01.680

+And then whenever I'm done talking,

+00:21:01.840 --> 00:21:03.800

+it's just immediately going to paste it wherever I am.

+00:21:04.260 --> 00:21:06.240

+So, oh, that's actually very nice,

+00:21:06.420 --> 00:21:08.700

+because now the keyboard shortcut is just me holding my thumb down

+00:21:08.840 --> 00:21:14.420

+instead of some sort of weird claw I got to eject to my keyboard.

+00:21:15.000 --> 00:21:17.780

+And suddenly, it doesn't necessarily become a convenience thing.

+00:21:17.880 --> 00:21:20.100

+It also becomes kind of a power user thing.

+00:21:20.240 --> 00:21:21.960

+Like you can really customize your computer experience

+00:21:22.350 --> 00:21:23.980

+if you can really map everything to your keyboard

+00:21:24.500 --> 00:21:25.740

+just the way that you like.

+00:21:25.810 --> 00:21:28.280

+It's like having Vim, but for all the apps out there.

+00:21:28.540 --> 00:21:29.560

+Okay, that's very neat. 
+ +00:21:29.830 --> 00:21:33.680 +Yeah, I've remapped Caps Lock to my Activate Mac Whisper. + +00:21:34.000 --> 00:21:35.520 +So if I hit Caps Lock and hold this down, + +00:21:35.610 --> 00:21:36.300 +then I can dictate. + +00:21:36.670 --> 00:21:38.520 +And I know computers have had dictation for a long time, + +00:21:38.630 --> 00:21:40.180 +but it's been really bad, right? + +00:21:40.380 --> 00:21:43.960 +The built-in dictation to macOS or Windows isn't great, + +00:21:44.040 --> 00:21:46.220 +especially when you try to talk acronyms like, + +00:21:46.340 --> 00:21:47.840 +hey, do PyPI, and it's like... + +00:21:47.840 --> 00:21:48.580 +Yeah, Mac Whisper. + +00:21:48.700 --> 00:21:51.280 +No, but Mac Whisper, it's not like all the way there, + +00:21:51.420 --> 00:21:55.440 +but I will say it surprised me in quality a few times, definitely. + +00:21:55.920 --> 00:21:56.480 +Yeah, yeah. + +00:21:56.660 --> 00:21:58.520 +And that app, you can go in and actually set replacements. + +00:21:58.800 --> 00:22:02.160 +Like if you ever think you're going to write this, write that instead. + +00:22:02.700 --> 00:22:03.920 +So I've done that with like Talk Python + +00:22:04.120 --> 00:22:07.840 +because it always just turns it into a camel case combined. + +00:22:08.200 --> 00:22:09.900 +I'm like, no, those are two separate words. + +00:22:10.060 --> 00:22:10.540 +You know, whatever. + +00:22:10.740 --> 00:22:11.980 +You can sort of customize it a bit. + +00:22:12.240 --> 00:22:14.460 +Yeah, and I think my favorite one, + +00:22:14.540 --> 00:22:16.760 +I think Mac Whisper actually gets this one wrong still, + +00:22:17.000 --> 00:22:22.480 +But whenever I write scikit-learn, which is a very popular data science package, it always + +00:22:22.740 --> 00:22:25.260 +translates it to psychologists learn. + +00:22:26.570 --> 00:22:27.140 +Which, you know. + +00:22:30.280 --> 00:22:33.700 +I never want you to ever write this series of words. 
+

+00:22:33.840 --> 00:22:36.640

+Because if I have to, I'll just type it out every time.

+00:22:36.780 --> 00:22:38.080

+It's not my common thing.

+00:22:39.480 --> 00:22:43.240

+It's one of these moments where actually I type scikit-learn often enough where it's

+00:22:43.270 --> 00:22:44.420

+like almost becoming the issue.

+00:22:44.660 --> 00:22:46.800

+So I'm like at the verge of adding these rules manually

+00:22:47.040 --> 00:22:48.300

+for all these weird open source packages

+00:22:48.410 --> 00:22:49.720

+that I interact with.

+00:22:49.880 --> 00:22:52.620

+But yeah, it's: take ergonomics seriously, people.

+00:22:52.800 --> 00:22:53.660

+That's the one thing I wanna say.

+00:22:53.690 --> 00:22:55.560

+You don't always have to buy a super expensive keyboard.

+00:22:56.780 --> 00:22:58.900

+The, if you wanna explore like programmatic keyboards

+00:22:59.180 --> 00:23:01.040

+because you can customize things, that's an excellent reason.

+00:23:01.600 --> 00:23:03.860

+But like take a break and do exercises and just,

+00:23:03.960 --> 00:23:05.040

+you know, be healthy.

+00:23:05.280 --> 00:23:07.200

+That's how you win a marathon.

+00:23:07.580 --> 00:23:08.360

+- Yes, that's for sure.

+00:23:08.780 --> 00:23:12.200

+Now this is not Vincent's keyboard review marathon,

+00:23:12.680 --> 00:23:16.080

+but let's wrap it up with, if you could take any keyboard

+00:23:17.340 --> 00:23:19.480

+hanging around on your wall there, which one would you use?

+00:23:19.660 --> 00:23:22.800

+- Well, so there's four pairs of these,

+00:23:22.920 --> 00:23:23.680

+so it's probably this one.

+00:23:25.179 --> 00:23:27.140

+So there's a couple of other boards that are great too. 
+

+00:23:27.480 --> 00:23:29.300

+This board is not the board I would recommend to everyone,

+00:23:29.720 --> 00:23:31.200

+but if you have serious RSI issues,

+00:23:31.320 --> 00:23:33.420

+I do think the Glove80 is your best bet.

+00:23:34.140 --> 00:23:35.680

+It's simple, in terms of the shape,

+00:23:36.040 --> 00:23:38.600

+it is probably the most ergonomic shape for most hand sizes.

+00:23:39.000 --> 00:23:40.200

+- Yeah, okay, awesome.

+00:23:40.960 --> 00:23:44.600

+All right, well, let's switch over to talking about programming.

+00:23:45.140 --> 00:23:45.360

+Yes.

+00:23:45.600 --> 00:23:48.460

+Programming with LLMs, not programming LLMs.

+00:23:48.860 --> 00:23:52.100

+And I guess, like you already called out, you wrote this course called

+00:23:52.540 --> 00:23:54.080

+LLM Building Blocks for Python.

+00:23:54.730 --> 00:23:57.540

+Super fun course, really neat, it's pretty short and concise.

+00:23:57.930 --> 00:24:03.140

+And it really talks about how can you reliably add some kind of LLM into your code.

+00:24:04.200 --> 00:24:08.219

+I guess what you're talking about in this course really applies regardless of

+00:24:08.240 --> 00:24:12.200

+whether it's a self-hosted one or it's OpenAI or Anthropic, right?

+00:24:12.230 --> 00:24:13.180

+You can have, there's some,

+00:24:13.450 --> 00:24:17.320

+some choices you can make on which LLM to use, right?

+00:24:17.660 --> 00:24:21.400

+So the, like the main idea I had with the course was like,

+00:24:22.260 --> 00:24:24.540

+an LLM is a building block at some point,

+00:24:24.570 --> 00:24:27.720

+but it's very unlike a normal building block when you're dealing with code,

+00:24:27.920 --> 00:24:29.180

+because normally with code,

+00:24:29.210 --> 00:24:31.960

+you put something into a function and like one thing comes out. 
+ +00:24:32.170 --> 00:24:34.240 +But in this particular case, you put a thing into a function, + +00:24:34.290 --> 00:24:36.240 +you have no idea upfront what's going to come out. + +00:24:36.760 --> 00:24:38.760 +And not only that, but you put the same thing in twice + +00:24:38.860 --> 00:24:40.120 +and something else might come out as well. + +00:24:40.500 --> 00:24:43.980 +So that means that you're going to want to think about this tool + +00:24:44.060 --> 00:24:45.220 +a bit more defensively. + +00:24:45.320 --> 00:24:46.220 +It's a weird building block. + +00:24:46.400 --> 00:24:48.120 +It's like you have to put a moat around it + +00:24:48.180 --> 00:24:49.800 +because otherwise the building block is going to do stuff + +00:24:49.800 --> 00:24:51.380 +you don't want it to do, almost. + +00:24:52.700 --> 00:24:53.780 +And some of that is syntax. + +00:24:53.900 --> 00:24:55.780 +Some of that is how do you think about these Lego bricks. + +00:24:56.260 --> 00:24:58.880 +And some of it is also just what is good methodology in general + +00:24:59.360 --> 00:25:03.140 +to statistically test if the thing is doing roughly what you want it to do. + +00:25:03.340 --> 00:25:03.540 +Nice. + +00:25:04.400 --> 00:25:05.260 +That's the gist of it. + +00:25:06.040 --> 00:25:06.500 +Yeah, cool. + +00:25:07.100 --> 00:25:10.280 +Yeah, I pulled out a couple of ideas, concepts, and tools + +00:25:10.480 --> 00:25:11.700 +that you talked about throughout the course. + +00:25:12.200 --> 00:25:13.580 +And you don't have to have taken the course + +00:25:13.800 --> 00:25:15.020 +to have these things be useful. + +00:25:15.240 --> 00:25:18.540 +I just thought it might be fun to riff on some of the things + +00:25:18.660 --> 00:25:19.380 +you touched on here. + +00:25:20.140 --> 00:25:21.980 +The main thing I think will be fun is-- + +00:25:22.010 --> 00:25:23.420 +it's been half a year, I think. + +00:25:23.720 --> 00:25:25.480 +It will be fun how much of it is still intact. 
+ +00:25:25.590 --> 00:25:27.220 +I think most of it still definitely is. + +00:25:27.740 --> 00:25:29.600 +But it might be fun to see if we can find anything + +00:25:29.760 --> 00:25:32.900 +that might be dated, just to see if the world has moved on + +00:25:33.140 --> 00:25:33.300 +quickly. + +00:25:33.760 --> 00:25:36.540 +I think there's only one thing, but I'm just kind of curious. + +00:25:36.740 --> 00:25:37.300 +Let's see. + +00:25:37.380 --> 00:25:37.840 +Yeah, all right. + +00:25:37.840 --> 00:25:40.600 +Well, let's keep our radar up for that. + +00:25:40.680 --> 00:25:43.140 +It's definitely something that's more changing quicker + +00:25:43.710 --> 00:25:45.700 +and has a higher likelihood of being dated. + +00:25:46.100 --> 00:25:47.360 +But I think it holds up pretty well. + +00:25:47.640 --> 00:25:48.180 +Yeah, okay. + +00:25:49.440 --> 00:25:53.140 +One of the things I remember emphasizing is you want to do some stuff like caching. + +00:25:53.420 --> 00:25:56.220 +So let's say you've got a function and you use an LLM for it. + +00:25:56.260 --> 00:25:58.220 +And let's keep it simple. + +00:25:58.340 --> 00:25:59.420 +Let's say we're just making summaries. + +00:25:59.620 --> 00:26:03.520 +So talk Python episode paragraph goes in, single sentence is supposed to come out. + +00:26:04.420 --> 00:26:04.980 +Something like that. + +00:26:05.860 --> 00:26:13.020 +Okay, well, you might have a loop and you're going to do maybe one pass, try one LLM with + +00:26:13.200 --> 00:26:16.400 +one type of setting, try another LLM with different type of settings to generate all + +00:26:16.500 --> 00:26:16.780 +this data. + +00:26:17.120 --> 00:26:21.820 +It would be a shame that you're going to use an LLM, which is kind of an expensive compute + +00:26:22.080 --> 00:26:25.580 +thing, if you put the same input in by accident and then you incur the cost twice. + +00:26:26.020 --> 00:26:27.020 +That would really stink. 
+

+00:26:27.520 --> 00:26:30.360

+So one of the things you always want to do is think a little bit about caching.

+00:26:31.040 --> 00:26:34.020

+And there's a Python library called diskcache that I've always loved to use.

+00:26:34.040 --> 00:26:35.680

+And I highly recommend people have a look at it.

+00:26:36.060 --> 00:26:38.380

+I think Michael, you've also used it in one of your courses before.

+00:26:39.200 --> 00:26:40.640

+The trick is we have to talk about this.

+00:26:40.740 --> 00:26:41.420

+It's so good.

+00:26:41.620 --> 00:26:42.380

+It is so good.

+00:26:42.900 --> 00:26:44.980

+It is SQLite and is so good.

+00:26:45.160 --> 00:26:46.980

+It is even better than SQLite.

+00:26:47.100 --> 00:26:48.520

+It is unbelievably good.

+00:26:48.680 --> 00:26:51.660

+And I have you to thank that I knew about it, but yeah, whatever.

+00:26:52.040 --> 00:26:53.920

+And then after I saw you use it, I'm like, genius.

+00:26:54.520 --> 00:26:55.320

+It is genius.

+00:26:56.040 --> 00:26:56.100

+Yes.

+00:26:56.440 --> 00:27:00.440

+No, so it's like having the LRU cache.

+00:27:00.820 --> 00:27:01.700

+But it's also on disk.

+00:27:01.830 --> 00:27:05.360

+So if you were to restart Python, you still have everything in cache basically.

+00:27:05.800 --> 00:27:10.080

+And it's a SQLite database, so you can always inspect all the stuff that's in there.

+00:27:10.440 --> 00:27:16.440

+If you wanted to, you can also do fancy things like add a time to live to every single object.

+00:27:16.700 --> 00:27:18.860

+And this is something you could do in a Docker container for a web app.

+00:27:19.570 --> 00:27:23.500

+But the main thing that's always nice when you're dealing with LLMs is you always want

+00:27:23.500 --> 00:27:27.160

+to be able to say in hindsight, like, okay, how did this LLM compare to that one?

+00:27:27.300 --> 00:27:28.140

+You want to compare outputs. 
+

+00:27:28.580 --> 00:27:31.920

+then just writing a little bit of a decorator on top of a function

+00:27:32.180 --> 00:27:34.960

+is the way to put it in SQLite, and you're just done with that concern.

+00:27:35.600 --> 00:27:37.340

+That is just amazing.

+00:27:37.660 --> 00:27:39.760

+And we're using this cache directly.

+00:27:40.140 --> 00:27:43.080

+If you're using LLM by Simon Willison from the command line,

+00:27:43.300 --> 00:27:46.200

+there's also a mechanism there so that you can get it into SQLite

+00:27:46.230 --> 00:27:46.780

+if you wanted to.

+00:27:46.940 --> 00:27:48.420

+So that's also a feature you could consider.

+00:27:49.270 --> 00:27:49.360

+But--

+00:27:49.360 --> 00:27:50.940

+Of course, he's going to put something in SQLite.

+00:27:51.100 --> 00:27:53.140

+Like, he can't write a library that doesn't put something in SQLite,

+00:27:53.290 --> 00:27:54.840

+given his Datasette project.

+00:27:55.070 --> 00:27:55.900

+It's Simon Willison.

+00:27:56.080 --> 00:27:58.000

+He'll put SQLite in SQLite if it's a Sunday.

+00:27:58.360 --> 00:27:58.600

+Yeah.

+00:28:01.440 --> 00:28:03.600

+But if you haven't used diskcache before,

+00:28:03.720 --> 00:28:06.240

+it definitely feels like one of these libraries

+00:28:06.500 --> 00:28:07.860

+that because I have it in my back pocket,

+00:28:07.960 --> 00:28:09.840

+it just feels like I can tackle more problems.

+00:28:10.160 --> 00:28:11.560

+That's the short story of it.

+00:28:11.700 --> 00:28:15.360

+And again, the course uses it in a very sort of basic fashion,

+00:28:15.620 --> 00:28:18.360

+but knowing that everything you do with an LLM

+00:28:19.820 --> 00:28:20.720

+only needs to happen once,

+00:28:20.880 --> 00:28:22.060

+if you're interested in using it once,

+00:28:22.640 --> 00:28:23.980

+that just saves you so much money.

+00:28:24.260 --> 00:28:25.380

+Yeah, and it's so useful. 
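The decorator-on-a-function idea can be sketched with nothing but the standard library. This is a rough stand-in for what a disk-backed cache like diskcache gives you out of the box, not its actual implementation; the `disk_memoize` name, the table layout, and the `summarize` function are all invented for the example.

```python
import hashlib
import json
import sqlite3
from functools import wraps


def disk_memoize(db_path: str = "llm_cache.sqlite"):
    """Persist a function's results to SQLite so that repeated calls
    with the same arguments never hit the expensive function again."""
    conn = sqlite3.connect(db_path)
    conn.execute("CREATE TABLE IF NOT EXISTS cache (key TEXT PRIMARY KEY, value TEXT)")

    def decorator(fn):
        @wraps(fn)
        def wrapper(*args, **kwargs):
            # Key on the function name plus every argument, which is the
            # (model, settings, prompt) idea from the conversation.
            raw = json.dumps([fn.__name__, args, sorted(kwargs.items())], default=str)
            key = hashlib.sha256(raw.encode()).hexdigest()
            row = conn.execute("SELECT value FROM cache WHERE key = ?", (key,)).fetchone()
            if row is not None:
                return json.loads(row[0])  # cache hit: no LLM call, no cost
            result = fn(*args, **kwargs)   # cache miss: pay for it once
            conn.execute("INSERT OR REPLACE INTO cache VALUES (?, ?)", (key, json.dumps(result)))
            conn.commit()
            return result
        return wrapper
    return decorator


calls = []


@disk_memoize(":memory:")  # real use would pass a file path; :memory: keeps the demo self-contained
def summarize(model: str, prompt: str) -> str:
    calls.append(prompt)  # stand-in for the real, expensive API call
    return f"summary-of-{prompt}-by-{model}"


first = summarize("gpt-x", "episode text")
second = summarize("gpt-x", "episode text")  # served from the cache, no second "call"
```

With a real file path instead of `:memory:`, the cache survives restarting Python, which is exactly the property being praised here.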
+ +00:28:25.720 --> 00:28:27.840 +So one of the things you did in the course is you said, + +00:28:27.920 --> 00:28:35.640 +All right, the key for the value that we're going to store is going to be the model, the model settings and the prompt as a tuple, right? + +00:28:35.780 --> 00:28:36.640 +Something along those lines. + +00:28:36.820 --> 00:28:38.360 +And then you use that as the key. + +00:28:38.390 --> 00:28:41.220 +So if any of those variations change, does the model change? + +00:28:41.640 --> 00:28:42.360 +Do the settings change? + +00:28:42.490 --> 00:28:44.940 +Or like anything, that's a totally different request. + +00:28:45.070 --> 00:28:46.280 +And then you just store the response. + +00:28:46.590 --> 00:28:56.280 +And then boom, if you ask the exact same question of the same model with the same prompt with the same settings, why do you need to go and wait 10 seconds and burn money and environmental badness? + +00:28:56.660 --> 00:28:58.740 +when you could literally within a microsecond + +00:28:59.040 --> 00:28:59.560 +get the answer back. + +00:29:00.120 --> 00:29:01.340 +Yeah, and there's also a fancy thing. + +00:29:01.420 --> 00:29:02.760 +It's a trick you can do on top. + +00:29:02.820 --> 00:29:04.240 +So sometimes you want to say, + +00:29:04.400 --> 00:29:06.700 +well, there's a statistical thing also happening. + +00:29:06.820 --> 00:29:08.160 +So sometimes I want to have one input + +00:29:08.380 --> 00:29:09.720 +and actually store maybe 10 outputs + +00:29:10.320 --> 00:29:11.300 +so I can look at all these outputs, + +00:29:11.500 --> 00:29:12.700 +maybe annotate that later. + +00:29:13.120 --> 00:29:14.040 +And the way you solve that + +00:29:14.080 --> 00:29:16.940 +is you just add an integer to the tuple, basically. + +00:29:17.400 --> 00:29:18.940 +And then you're also able to store many outputs + +00:29:19.140 --> 00:29:19.900 +if you really like. + +00:29:20.700 --> 00:29:21.640 +The sky's the limit. 
+

+00:29:21.640 --> 00:29:22.660

+The fourth one, the eighth one, whatever.

+00:29:23.400 --> 00:29:24.940

+Yeah, that's flexible.

+00:29:25.800 --> 00:29:25.880

+Yeah.

+00:29:26.200 --> 00:29:29.500

+And it does a whole bunch of neat, neat things that are really, really wild.

+00:29:29.720 --> 00:29:32.520

+Like, so it looks like just, Hey, I put this value into it.

+00:29:32.620 --> 00:29:32.720

+Right.

+00:29:32.720 --> 00:29:33.320

+And it stores it.

+00:29:33.520 --> 00:29:37.420

+It's really powerful because it persists across application executions.

+00:29:37.900 --> 00:29:40.700

+So like maybe if you're caching that response in your notebook and

+00:29:40.700 --> 00:29:44.120

+you're doing some testing, if you come back later, you start the notebook back up

+00:29:44.200 --> 00:29:46.960

+or restart the kernel or whatever, like it's not like an LRU cache.

+00:29:47.120 --> 00:29:51.540

+It remembers because it's stored somewhere in like temporary storage in a local

+00:29:51.880 --> 00:29:53.300

+SQLite file, which is amazing.

+00:29:53.720 --> 00:29:55.900

+It also has interesting ideas.

+00:29:56.260 --> 00:29:57.540

+I'm not sure really where they are,

+00:29:58.060 --> 00:30:00.840

+but it has different kinds of caches as well.

+00:30:00.940 --> 00:30:03.040

+So maybe you're storing a ton of stuff in there.

+00:30:03.260 --> 00:30:05.680

+And so it'll do basically built-in sharding.

+00:30:06.020 --> 00:30:06.600

+Oh, yeah.

+00:30:06.880 --> 00:30:07.980

+Multiple SQLite files.

+00:30:08.500 --> 00:30:09.980

+And it's really, really good.

+00:30:10.540 --> 00:30:11.520

+This is a deep library.

+00:30:11.700 --> 00:30:13.340

+This is not just, oh, yeah.

+00:30:13.620 --> 00:30:14.980

+It's like LRU, but to disk.

+00:30:15.240 --> 00:30:18.100

+So the cool thing about that library is it really does go deep.

+00:30:18.600 --> 00:30:21.580

+But if you really just want to use it as a dictionary, you can. 
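The tuple key described above, plus the add-an-integer trick for keeping several samples per input, looks roughly like this. It is a sketch only: a plain dict stands in for the dictionary-like on-disk cache, and `fake_llm` is a placeholder for a real completion call.

```python
import random

cache: dict = {}  # stand-in for a dictionary-like disk cache


def fake_llm(model: str, temperature: float, prompt: str) -> str:
    # Placeholder for a real LLM call; the random suffix mimics sampling.
    return f"{model}@{temperature}: {prompt[:20]}... ({random.random():.3f})"


def sample(model: str, temperature: float, prompt: str, n: int = 1) -> str:
    """Cache keyed on (model, settings, prompt) plus a sample index n,
    so the same input can be sampled and stored several times."""
    key = (model, ("temperature", temperature), prompt, n)
    if key not in cache:
        cache[key] = fake_llm(model, temperature, prompt)
    return cache[key]


a = sample("model-a", 0.7, "Summarize this episode", n=1)
b = sample("model-a", 0.7, "Summarize this episode", n=1)  # same key, served from cache
c = sample("model-a", 0.7, "Summarize this episode", n=2)  # new index, fresh sample stored
```

Changing the model, any setting, the prompt, or the index produces a different key, so nothing is ever silently reused across genuinely different requests.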
+ +00:30:22.080 --> 00:30:23.540 +That's the thing that I really love about it. + +00:30:23.640 --> 00:30:26.760 +But for all intents and purposes, you just treat it as a dictionary, and you're good. + +00:30:26.980 --> 00:30:29.420 +Or use it as a decorator on a function, and again, you're good. + +00:30:31.940 --> 00:30:36.080 +So again, it's one of those little hacks where, oh, if you just know about the library, + +00:30:36.220 --> 00:30:39.100 +you just become more productive at typical Python stuff. + +00:30:39.320 --> 00:30:39.600 +Yeah. + +00:30:39.880 --> 00:30:41.460 +I'll give you a place where I'm using it, actually. + +00:30:41.620 --> 00:30:46.060 +I use it on some LLM stuff, like where I'm programming against an LLM, for exactly the + +00:30:46.120 --> 00:30:46.580 +reason you did. + +00:30:46.720 --> 00:30:49.200 +Because if you have the same inputs, don't ask the question again. + +00:30:49.540 --> 00:30:50.380 +You just hear the answer. + +00:30:50.540 --> 00:30:51.200 +It's real, real fast. + +00:30:51.460 --> 00:30:54.680 +But if you go over to like Talk Python, let's see here. + +00:30:55.160 --> 00:30:56.780 +Go over to the guests, for example, right? + +00:30:56.900 --> 00:30:57.740 +So there's a bunch of guests. + +00:30:58.030 --> 00:30:58.800 +And here we have Vincent. + +00:30:59.210 --> 00:30:59.740 +Not that Vincent. + +00:31:00.160 --> 00:31:00.600 +Not that Vincent. + +00:31:01.000 --> 00:31:01.740 +There you are, that Vincent. + +00:31:01.740 --> 00:31:02.380 +All these Vincents. + +00:31:04.100 --> 00:31:05.000 +They're all over the... + +00:31:05.500 --> 00:31:07.420 +I have like 560 guests or something. + +00:31:07.520 --> 00:31:07.840 +There's a lot. 
+

+00:31:08.340 --> 00:31:12.840

+But in here, you'll notice that if you go into the view of the source on this thing,

+00:31:13.090 --> 00:31:16.120

+like all over the place, anytime there's a picture,

+00:31:16.380 --> 00:31:19.560

+it'll have this little cache-busting ID on the end.

+00:31:20.480 --> 00:31:22.140

+And that's fine when it's served locally

+00:31:22.230 --> 00:31:23.200

+because you can just look at the file

+00:31:23.200 --> 00:31:25.280

+and see it's still the same file, with the same hash, at startup.

+00:31:25.570 --> 00:31:28.600

+But if it comes from like an S3 blob storage,

+00:31:29.190 --> 00:31:30.300

+you know, and the app restarts,

+00:31:30.630 --> 00:31:31.760

+how do I know what that is?

+00:31:31.880 --> 00:31:32.980

+Like it has to go,

+00:31:33.800 --> 00:31:37.000

+it would have to re-download the entire content of the blob.

+00:31:37.980 --> 00:31:38.860

+So check this out.

+00:31:39.220 --> 00:31:39.440

+Yeah.

+00:31:39.540 --> 00:31:40.260

+So I just added.

+00:31:40.800 --> 00:31:41.640

+Yeah, yeah, yeah.

+00:31:41.820 --> 00:31:44.540

+So this feels like there's something between the proper CDN

+00:31:44.680 --> 00:31:45.860

+and then getting it from S3.

+00:31:45.990 --> 00:31:46.900

+Like there are these moments

+00:31:46.950 --> 00:31:48.340

+when you want to have something that's kind of in between.

+00:31:48.900 --> 00:31:51.240

+And then disk cache could actually be a good solution for it.

+00:31:51.480 --> 00:31:51.780

+Exactly.

+00:31:52.140 --> 00:31:54.460

+So what the site does is it has a disk cache.

+00:31:54.560 --> 00:31:58.460

+And anytime it says, hey, I want to refer to a resource that's external,

+00:31:59.100 --> 00:32:01.340

+it'll download it once, compute the hash,

+00:32:01.420 --> 00:32:05.800

+and then store it in the disk cache unless you change something behind it. 
+ +00:32:05.800 --> 00:32:09.080 +And so it's automatically using this, and it makes everything like, + +00:32:09.340 --> 00:32:12.140 +there's never stale resources, and it's instantly fast, + +00:32:12.240 --> 00:32:15.300 +even if they're served out of something remote like S3 equivalent. + +00:32:15.600 --> 00:32:18.520 +And do you also do a time to live? + +00:32:19.260 --> 00:32:23.780 +Is it also the time to live is also like every day it's allowed to refresh once or something like that, I suppose? + +00:32:24.820 --> 00:32:32.320 +Yeah, for the S3 stuff, I don't because I've set up all the admin functions that if I ever change one through the admin, it deletes it out of the cache. + +00:32:32.640 --> 00:32:32.840 +Gotcha. + +00:32:33.240 --> 00:32:34.940 +So it's like it's internally consistent. + +00:32:35.560 --> 00:32:46.020 +But for like other things, like if it parses something out of the, say, the description, which is set in the dictionary, that stuff, it's just got a time to live of like a day or something. + +00:32:46.090 --> 00:32:47.560 +And it's got like, there's a bunch of those. + +00:32:47.760 --> 00:32:48.960 +I'm using all these different places. + +00:32:49.660 --> 00:32:50.460 +And wow, it's so good. + +00:32:50.580 --> 00:32:51.840 +I just wanted to say thank you, because I + +00:32:51.840 --> 00:32:52.820 +knew we were going to talk about it today, + +00:32:53.240 --> 00:32:53.980 +because this is so good. + +00:32:54.100 --> 00:32:54.460 +Yeah, I know. + +00:32:57.380 --> 00:32:59.080 +If it was part of the standard library, + +00:32:59.320 --> 00:33:00.760 +I would honestly not be surprised. + +00:33:01.320 --> 00:33:03.020 +That's also the story with that. + +00:33:03.140 --> 00:33:04.340 +But yeah, SQLite is great. + +00:33:04.600 --> 00:33:05.140 +DiskCache is great. + +00:33:06.460 --> 00:33:08.540 +It feels like a dictionary, but it gives you so much more. + +00:33:08.840 --> 00:33:09.100 +It's great. 
+ +00:33:09.380 --> 00:33:10.800 +That's the only thing I can say about it. + +00:33:10.960 --> 00:33:12.360 +So people are probably like, wait, + +00:33:12.410 --> 00:33:13.620 +I thought we were talking about LLMs. + +00:33:14.330 --> 00:33:14.720 +Yeah, we are. + +00:33:14.720 --> 00:33:15.860 +I think this is one of the interesting things, + +00:33:16.000 --> 00:33:20.560 +It's because there's all these interesting tools and it's not even about using agentic AI. + +00:33:20.810 --> 00:33:27.980 +And it's not like there's really cool libraries and tools that you can just apply to this LLM problem, but also apply everywhere else. + +00:33:28.130 --> 00:33:28.220 +Right. + +00:33:29.060 --> 00:33:29.800 +It's one of those. + +00:33:30.380 --> 00:33:33.740 +So it's one thing I have found with like LLMs in general if you're building software with it. + +00:33:34.000 --> 00:33:40.480 +On the one end, I think it can be very helpful if you're like a proper like senior engineer kind of a person because then you know about things like, oh, I want a cache. + +00:33:40.880 --> 00:33:41.780 +And what's the pragmatic cache? + +00:33:41.890 --> 00:33:43.400 +And you can also pick Redis, by the way. + +00:33:43.520 --> 00:33:44.480 +If you would have used Redis for this, + +00:33:44.560 --> 00:33:45.260 +that could have also worked. + +00:33:45.400 --> 00:33:45.860 +Just that I think-- + +00:33:45.880 --> 00:33:46.580 +- Sure, it would have been fine. + +00:33:46.960 --> 00:33:47.600 +- Yeah, it would have been fine. + +00:33:47.680 --> 00:33:49.380 +- But you know what I like is there's no servers. + +00:33:49.840 --> 00:33:50.940 +I don't have to deal with the servers. + +00:33:51.260 --> 00:33:53.200 +Like, it's just, it's a file. + +00:33:53.780 --> 00:33:54.420 +- It's very true. + +00:33:54.520 --> 00:33:55.480 +- And it just keeps it simpler. + +00:33:55.900 --> 00:33:56.560 +- It's totally true. 
+ +00:33:56.840 --> 00:33:58.260 +But the point I wanted to get out here, + +00:33:58.320 --> 00:34:00.940 +like your previous experience as a proper engineer + +00:34:01.300 --> 00:34:03.560 +will still help you write good LLM software. + +00:34:03.980 --> 00:34:05.980 +However, from a similar perspective, + +00:34:06.060 --> 00:34:07.380 +I also think that we do have this generation + +00:34:07.840 --> 00:34:08.659 +of like data scientists, + +00:34:08.899 --> 00:34:11.220 +and maybe data engineer, data analyst kind of person, + +00:34:11.580 --> 00:34:14.860 +like thinking analytically being quite critical of the output of an algorithm. + +00:34:15.399 --> 00:34:19.340 +That's also like a good bone to have in this day and age, I would say. + +00:34:20.100 --> 00:34:24.720 +Because if I've built a couple of recommenders back in my day, like it's been a decade now, + +00:34:24.879 --> 00:34:28.560 +but like one of the things you do learn when you're building a recommender is that you're stuck with this problem of, + +00:34:29.000 --> 00:34:32.320 +hey, I gave this user this recommendation and they clicked it. + +00:34:32.679 --> 00:34:35.820 +Would they have clicked this other thing if I would have recommended it to them? + +00:34:36.260 --> 00:34:40.200 +And suddenly you're dealing with a system that's like really stochastic and hard to predict. + +00:34:40.500 --> 00:34:44.360 +and you have to be kind of strict about the way you test and compare these different algorithms. + +00:34:44.629 --> 00:34:47.280 +And you want to think twice about the way you A-B test these things. + +00:34:47.389 --> 00:34:50.200 +And, oh, actually, just like disk cache is useful as a tool, + +00:34:50.580 --> 00:34:55.540 +having a little bit of methodology statistically in your mind will also help you + +00:34:55.690 --> 00:35:01.620 +because comparing LLMs, a lot of it is doing evaluations, being kind of strict about that. 
+
+00:35:01.630 --> 00:35:03.020
+And that's also what I try to do in the course.
+
+00:35:03.110 --> 00:35:05.540
+I try to just show you that if you're really strict about evaluations,
+
+00:35:06.400 --> 00:35:09.719
+then you can also learn that for some problems, you're still better off using scikit-learn,
+
+00:35:09.740 --> 00:35:13.300
+because you just evaluate it, and then you learn that, like, the number is better on the
+
+00:35:13.420 --> 00:35:17.760
+scikit-learn side of things. Yeah, that's when you feel like, oh, maybe I did it wrong, where you paid
+
+00:35:17.760 --> 00:35:22.320
+a bunch of money to run expensive, slow LLMs, and you're like, I could have just used... Well, so it's funny
+
+00:35:22.400 --> 00:35:26.940
+you say that. So I've actually been talking to a bunch of people that do LLMs at companies here in
+
+00:35:26.940 --> 00:35:31.480
+the Netherlands. And, you know, you go to a PyData, you go to a conference, and you give them just
+
+00:35:31.600 --> 00:35:37.799
+enough beer so they're honest. In that kind of a situation, the NDA curtain opens just a little,
+
+00:35:37.820 --> 00:35:38.040
+or whatever.
+
+00:35:38.640 --> 00:35:40.960
+Plausible deniability is the name of the game in this one, yes.
+
+00:35:42.039 --> 00:35:45.720
+But then the stories you hear, which I did find somewhat encouraging,
+
+00:35:45.890 --> 00:35:48.920
+is they do all kind of go, well, there's budget now to do AI stuff,
+
+00:35:49.250 --> 00:35:51.460
+which means that we try out all the different LLMs,
+
+00:35:51.490 --> 00:35:53.620
+and we really can invest in evals and that sort of thing.
+
+00:35:54.250 --> 00:35:56.900
+And funnily enough, we also put some base models in there,
+
+00:35:57.070 --> 00:35:59.220
+like as a standard benchmark that we should beat.
+
+00:35:59.600 --> 00:36:02.020
+And I've heard a story a bunch of times now
+
+00:36:02.180 --> 00:36:04.400
+that because of the hype around LLMs and AI,
+
+00:36:05.190 --> 00:36:06.800
+after it was implemented, after they did all the benchmarks,
+
+00:36:07.300 --> 00:36:10.660
+it turns out that AI is the reason that scikit-learn is now in production in a bunch of places.
+
+00:36:10.790 --> 00:36:13.480
+And it's also the same thing with like spaCy.
+
+00:36:13.550 --> 00:36:19.300
+Because what a lot of people do learn is that, hey, if the spaCy model, or like the lightweight model, so to say,
+
+00:36:19.440 --> 00:36:23.940
+is like somewhat equivalent to the LLM model after you give it like enough training data,
+
+00:36:24.030 --> 00:36:26.980
+which you do want to have anyway, because you need that for evaluations.
+
+00:36:27.620 --> 00:36:32.020
+Well, typically those models are more lightweight and they will always produce the same output.
+
+00:36:32.240 --> 00:36:34.160
+Same thing goes in, same prediction will always come out.
+
+00:36:34.480 --> 00:36:37.420
+And you can really fence it off, have it on like a normal VM.
+
+00:36:37.700 --> 00:36:38.600
+That's also totally fine.
+
+00:36:39.000 --> 00:36:42.040
+Oh, and you know, another benefit, it's super lightweight to run.
+
+00:36:42.180 --> 00:36:44.360
+You just need a Lambda function and you're good,
+
+00:36:44.820 --> 00:36:48.000
+as opposed to like a GPU or like a huge cloud bill or something like that.
+
+00:36:48.260 --> 00:36:51.860
+So some of the stories that I'm hearing do suggest that,
+
+00:36:52.020 --> 00:36:56.180
+okay, these LLMs are also helping out like standard data science work, as it were,
+
+00:36:57.000 --> 00:37:00.000
+if only because management now really does want to be serious about the investment
+
+00:37:00.100 --> 00:37:01.400
+and really wants to do the experiments.
+
+00:37:03.040 --> 00:37:07.460
+This portion of Talk Python To Me is brought to you by NordStellar. NordStellar is a threat exposure
+
+00:37:07.720 --> 00:37:13.200
+management platform from the Nord Security family, the folks behind NordVPN, that combines dark web
+
+00:37:13.380 --> 00:37:18.780
+intelligence, session hijacking prevention, brand abuse detection, and external attack surface
+
+00:37:19.100 --> 00:37:24.220
+management. Keeping your team and your company secure is a daunting challenge. That's why you
+
+00:37:24.360 --> 00:37:29.979
+need NordStellar on your side. It's a comprehensive set of services, monitoring, and alerts to limit
+
+00:37:30.000 --> 00:37:35.800
+your exposure to breaches and attacks and act instantly if something does happen. Here's how
+
+00:37:35.800 --> 00:37:41.660
+it works. NordStellar detects compromised employee and consumer credentials. It detects stolen
+
+00:37:42.260 --> 00:37:48.180
+authentication cookies found in InfoStealer logs and dark web sources and flags compromised devices,
+
+00:37:48.640 --> 00:37:54.839
+reducing MFA bypass ATOs without extra code in your app. NordStellar scans the dark web for
+
+00:37:54.740 --> 00:38:01.340
+cyber threats targeting your company. It monitors forums, markets, ransomware blogs, and over 25,000
+
+00:38:01.580 --> 00:38:07.020
+cybercrime Telegram channels with alerting and searchable contexts you can route to Slack or your
+
+00:38:07.260 --> 00:38:13.300
+IR tool. NordStellar adds brand and domain protection. It detects cybersquats and lookalikes
+
+00:38:13.760 --> 00:38:19.040
+via visual and content similarity and certificate transparency logs, plus broader brand abuse
+
+00:38:19.280 --> 00:38:24.300
+takedowns across the web, social, and app stores to cut the phishing risk for your users.
+
+00:38:24.640 --> 00:38:28.560
+They don't just alert you about impersonation, they file and manage the removals.
+
+00:38:29.120 --> 00:38:31.240
+Finally, NordStellar is developer-friendly.
+
+00:38:31.840 --> 00:38:34.440
+It's available as a platform and an API.
+
+00:38:35.120 --> 00:38:36.000
+No agents to install.
+
+00:38:36.540 --> 00:38:39.660
+If security is important to you and your organization, check out NordStellar.
+
+00:38:40.140 --> 00:38:42.380
+Visit talkpython.fm/nordstellar.
+
+00:38:42.490 --> 00:38:45.420
+The link is in your podcast player's show notes and on the episode page.
+
+00:38:45.880 --> 00:38:48.560
+Please use our link, talkpython.fm/nordstellar,
+
+00:38:49.000 --> 00:38:52.080
+so that they know that you heard about their service from us.
+
+00:38:52.420 --> 00:38:53.540
+And you know what time of year it is.
+
+00:38:53.840 --> 00:38:54.860
+It's late fall.
+
+00:38:55.290 --> 00:38:57.540
+That means Black Friday is in play as well.
+
+00:38:57.610 --> 00:39:01.600
+So the folks at NordStellar gave us a coupon, BLACKFRIDAY20.
+
+00:39:02.120 --> 00:39:05.200
+That's Black Friday, all one word, all caps, 20, two, zero.
+
+00:39:05.700 --> 00:39:07.080
+That grants you 20% off.
+
+00:39:07.090 --> 00:39:11.640
+So if you're going to sign up for them soon, go ahead and use BLACKFRIDAY20 as a code,
+
+00:39:11.840 --> 00:39:13.220
+and you might as well save 20%.
+
+00:39:13.280 --> 00:39:15.800
+It's good until December 10th, 2025.
+
+00:39:16.740 --> 00:39:20.080
+Thank you to the whole Nord Security team for supporting Talk Python To Me.
+
+00:39:21.120 --> 00:39:24.320
+You could say, we're going to use... your mandate is to add AI.
+
+00:39:24.700 --> 00:39:26.640
+And then you go, well, we've got to see how well it's working.
+
+00:39:26.720 --> 00:39:27.500
+So we tested it.
+
+00:39:27.560 --> 00:39:29.820
+And then we have to compare it to something of a benchmark.
+
+00:39:30.600 --> 00:39:31.000
+Exactly.
+
+00:39:32.900 --> 00:39:36.800
+The benchmark is fast and effectively free and about as good.
+
+00:39:36.980 --> 00:39:38.260
+So we're good.
+
+00:39:38.380 --> 00:39:39.000
+We're just going to do that.
+
+00:39:39.120 --> 00:39:40.840
+You know, I think it's fine.
+
+00:39:41.220 --> 00:39:44.900
+Some organizations need, like, a mandate from above in order to get something done.
+
+00:39:45.220 --> 00:39:50.220
+And this LLM craze, if nothing else, does seem to have caused the mandate from above.
+
+00:39:50.900 --> 00:39:51.880
+I'm sure it has.
+
+00:39:52.260 --> 00:39:56.760
+So that's something I would keep in the back of your mind as well, dear listener.
+
+00:39:56.980 --> 00:40:00.200
+Like sometimes that mandate can also give you permission to do other things.
+
+00:40:00.540 --> 00:40:00.760
+Yeah.
+
+00:40:01.020 --> 00:40:05.440
+Not having worked for a large enterprise-type company for quite a while,
+
+00:40:05.760 --> 00:40:11.420
+I imagine I'm pretty blind to how that is transforming directives from the top.
+
+00:40:11.460 --> 00:40:15.540
+But I'm sure bosses are like, I'm using ChatGPT and it's so much better than our software.
+
+00:40:15.760 --> 00:40:16.620
+What are we going to do about that?
+
+00:40:16.760 --> 00:40:21.400
+You know? Yeah, it's, uh... Well, I mean, I'm in the developer tooling space, so I still talk to big
+
+00:40:21.520 --> 00:40:25.000
+companies and small companies as well. You do notice that big companies, they work differently
+
+00:40:25.070 --> 00:40:28.920
+than small companies. That is certainly true. For better or worse. Like, there's also
+
+00:40:28.960 --> 00:40:32.960
+good reasons why bigger companies... like, if I'm a bank, I'm pretty sure it's a good idea to
+
+00:40:33.020 --> 00:40:38.660
+also have some rules that I have to abide by. Yeah, I'm just thinking more of, like, how disconnected
+
+00:40:38.920 --> 00:40:43.619
+is the higher-level management from the product, and, like, how much do they just dictate a thing.
+
+00:40:43.760 --> 00:40:49.560
+Okay, so let's talk about some of the tools. One of the things you primarily focused on was using
+
+00:40:50.080 --> 00:40:56.240
+Simon Willison's LLM library. Tell us about this thing. So the funny thing about that library is
+
+00:40:56.260 --> 00:41:00.700
+that it's actually kind of more meant as a command line utility. I think that's kind of the entry
+
+00:41:00.920 --> 00:41:05.820
+point that he made. But then he also just made sure there was some sort of a Python API around it.
+
+00:41:06.480 --> 00:41:10.440
+And after looking at that library, and also after playing around with all these other libraries,
+
+00:41:11.460 --> 00:41:13.200
+and I'm about to say this, but this is a compliment.
+
+00:41:13.750 --> 00:41:16.220
+I just found the LLM library by Simon Willison
+
+00:41:16.740 --> 00:41:18.060
+by far to be the most boring.
+
+00:41:18.660 --> 00:41:21.400
+And I mean that really in a good way, just unsurprising,
+
+00:41:22.130 --> 00:41:23.060
+only does a few things.
+
+00:41:23.220 --> 00:41:25.580
+The few things that it does, it does in a very predictable way.
+
+00:41:26.040 --> 00:41:28.080
+And especially if you're doing rapid prototyping,
+
+00:41:28.280 --> 00:41:29.760
+what I felt was just kind of nice is that
+
+00:41:30.010 --> 00:41:31.360
+it does feel like you have building blocks
+
+00:41:31.390 --> 00:41:32.540
+that allow you to build very quickly,
+
+00:41:33.150 --> 00:41:35.860
+but it doesn't necessarily feel like you're dealing with abstractions
+
+00:41:35.920 --> 00:41:37.640
+that can sort of wall you in at some point.
+
+00:41:38.120 --> 00:41:40.400
+So for a hello world getting started,
+
+00:41:40.840 --> 00:41:42.920
+just do some things in a very Pythonic way,
+
+00:41:43.900 --> 00:41:46.060
+this boring library really did the trick.
+
+00:41:46.200 --> 00:41:48.920
+And I'm also happy to report it's still a library that I use.
+
+00:41:49.300 --> 00:41:51.060
+It's still definitely a library in my toolbox.
+
+00:41:51.140 --> 00:41:52.360
+Whenever I want to do something real quick,
+
+00:41:53.340 --> 00:41:55.840
+that is definitely a tool that I refer to.
+
+00:41:56.360 --> 00:41:57.580
+And one of the things--
+
+00:41:57.580 --> 00:42:01.620
+Is it kind of an abstraction across the different LLMs
+
+00:42:01.840 --> 00:42:02.540
+and that kind of stuff?
+
+00:42:02.680 --> 00:42:05.900
+If I want to talk to Anthropic versus OpenAI, that doesn't matter?
+
+00:42:06.240 --> 00:42:09.540
+So the way that this is built is also, I think, with good taste.
+
+00:42:09.700 --> 00:42:15.260
+So what you don't want to do as a library is say, like, I'm going to basically support every provider under the sun.
+
+00:42:15.440 --> 00:42:20.000
+Because as a single maintainer, in the case of Simon Willison, you're just going to drown in all these different providers.
+
+00:42:20.240 --> 00:42:22.860
+So what he went for instead was a plugin ecosystem.
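The plugin-ecosystem idea being described can be boiled down to a registry: the core library keeps a mapping, and each provider plugin registers the models it knows how to serve. Simon Willison's llm wires this up through Python packaging entry points; the toy below only shows the shape of the pattern, and every name in it is illustrative.

```python
# Toy plugin registry: the "core library" owns the dict, and each
# "plugin" registers its models with a decorator at import time.
_models = {}


def register_model(name):
    def deco(fn):
        _models[name] = fn
        return fn
    return deco


def get_model(name):
    # The core library looks providers up by name, never imports them directly.
    return _models[name]


# A provider plugin would ship something like this:
@register_model("echo-1")
def echo_model(prompt):
    return f"echo: {prompt}"


print(get_model("echo-1")("hello"))
```

The payoff is exactly the one discussed: the core stays small, and a Mistral or OpenRouter plugin can be maintained by whoever cares about that provider.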
+
+00:42:23.200 --> 00:42:30.440
+Now, the downside of a plugin ecosystem is that then you defer the responsibility of maintaining a plugin for a specific source to another maintainer.
+
+00:42:30.800 --> 00:42:34.000
+But you might get someone who works at Mistral to make the Mistral plugin.
+
+00:42:34.200 --> 00:42:37.740
+And you might get someone who works at OpenRouter to make the OpenRouter plugin, etc.
+
+00:42:38.400 --> 00:42:42.220
+So you do distribute the workload in kind of a nice way.
+
+00:42:42.580 --> 00:42:45.660
+And all the bigger models are definitely supported.
+
+00:42:45.820 --> 00:42:48.060
+So the Anthropic and the OpenAI ones,
+
+00:42:48.200 --> 00:42:49.240
+those are just always in there.
+
+00:42:50.100 --> 00:42:54.140
+But you will also definitely find some of the more exotic ones
+
+00:42:54.230 --> 00:42:56.000
+that they will also just have a plugin themselves.
+
+00:42:56.420 --> 00:42:57.960
+And one thing that also helps under the hood
+
+00:42:58.100 --> 00:43:01.240
+is that OpenAI has a standard under the hood.
+
+00:43:01.360 --> 00:43:03.700
+Their SDK has become a bit of a standard across the industry.
+
+00:43:04.130 --> 00:43:07.160
+So you can also reuse the OpenAI stuff.
+
+00:43:07.420 --> 00:43:09.200
+It would not surprise me at all that if you were just
+
+00:43:09.260 --> 00:43:11.640
+to change a URL in the settings, that you can also
+
+00:43:11.990 --> 00:43:14.860
+connect to Mistral via the OpenAI objects.
+
+00:43:15.020 --> 00:43:16.980
+I would have to double-check, but that wouldn't surprise me.
+
+00:43:17.200 --> 00:43:20.140
+Yeah, that is a very common way of just like,
+
+00:43:20.230 --> 00:43:22.700
+you know what, we're all going to just adopt OpenAI's API.
+
+00:43:23.260 --> 00:43:24.140
+It's a little like S3.
+
+00:43:24.660 --> 00:43:26.160
+Like when I was saying S3 earlier,
+
+00:43:26.330 --> 00:43:27.820
+I was actually talking to DigitalOcean.
+
+00:43:28.540 --> 00:43:29.160
+But it doesn't matter.
+
+00:43:29.420 --> 00:43:31.220
+I'm just still using Boto3 to talk to it.
+
+00:43:31.440 --> 00:43:33.880
+Yeah, it does feel weird. Like, oh, you want to use DigitalOcean?
+
+00:43:34.100 --> 00:43:39.260
+You have to download an SDK from a competing cloud provider, and then you can.
+
+00:43:39.940 --> 00:43:41.180
+Exactly. It's so weird.
+
+00:43:41.390 --> 00:43:45.440
+But I mean, the thing I do find just a little bit funny here is, like, technically,
+
+00:43:45.740 --> 00:43:49.800
+this thing is meant to be used from the command line. It just happens that it's written in Python,
+
+00:43:49.930 --> 00:43:54.040
+and it just happens that also it has a decent Python API. And that's the thing I actually end up
+
+00:43:54.120 --> 00:43:58.560
+using more than stuff from the command line. That's because I do a lot of things in notebooks. So the
+
+00:43:58.700 --> 00:44:01.180
+stuff that I tend to use is a little bit more on the Python side.
+
+00:44:01.460 --> 00:44:02.600
+Sure, of course. Same here.
+
+00:44:02.620 --> 00:44:04.820
+Yeah, I don't think I would use it very much on the command line.
+
+00:44:05.080 --> 00:44:11.180
+But, you know, I can see using it as a step in a series of things happening on the command line,
+
+00:44:11.560 --> 00:44:15.980
+like some sort of orchestration, like X, Y, ask the LLM, Z, you know?
+
+00:44:16.280 --> 00:44:20.940
+Well, the main thing I actually use it for from the command line is you can do a git diff
+
+00:44:21.220 --> 00:44:25.860
+on whatever you're working on and then sort of pipe that through to the LLM to make the commit message.
+
+00:44:26.120 --> 00:44:29.900
+Like there are like these little moments like that where you do want to have a little automation,
+
+00:44:29.980 --> 00:44:34.640
+maybe in CI, and, like, being able to do this from a command line is definitely useful. It's just not
+
+00:44:34.640 --> 00:44:38.600
+the thing I use it for most. Or, like, if I use it that way, it's a thing I automate once and then
+
+00:44:38.680 --> 00:44:44.100
+never really look at it. But the interaction really for me is more in the notebook. Oh yeah, that's a
+
+00:44:44.260 --> 00:44:48.940
+very cool library. I definitely need it. Um, like, one thing that might be fun to point out, because
+
+00:44:49.040 --> 00:44:54.040
+that's something I built, I think, around the same time. Like, if you were to type into Google
+
+00:44:54.080 --> 00:45:01.300
+smartfunc and then my GitHub alias, koaning, K-O-A-N-I-N-G, um, because it's a little...
+
+00:45:02.100 --> 00:45:07.020
+Okay, you already had it open. Good man. I'm really good with Google, actually. I use Startpage these
+
+00:45:07.180 --> 00:45:10.920
+days, but I'm sure I could find it there. Oh, okay. Here you... Well, you have an ergonomic keyboard,
+
+00:45:11.100 --> 00:45:16.720
+so there you go, quick typing. Um, already, yeah. No, so, like, a thing I was able to build on top of
+
+00:45:16.860 --> 00:45:24.020
+the thing that Simon Willison made, and it's something that appears in the course as well...
+
+00:45:24.040 --> 00:45:28.720
+that. Yeah, there you go. So basically what this function does is you can add a decorator.
+
+00:45:30.150 --> 00:45:35.220
+You just have a normal Python function. The inputs, you just put types on it. So you can say, like,
+
+00:45:35.230 --> 00:45:40.380
+this input is a string, this input is an integer, or what have you. You then add a decorator that
+
+00:45:40.540 --> 00:45:45.520
+says what backend you want to use. So GPT-4 or anything that Simon Willison supports.
And then + +00:45:45.530 --> 00:45:51.260 +the doc string is the prompt you give to the LLM. So you can do something like, hey, I have a function + +00:45:51.280 --> 00:45:52.820 +that accepts a term called paragraph. + +00:45:53.420 --> 00:45:53.880 +That's a string. + +00:45:54.270 --> 00:45:55.960 +And then the doc string of the function says, + +00:45:56.150 --> 00:45:56.660 +summarize this. + +00:45:56.970 --> 00:45:59.120 +And then lo and behold, you now have a Python function + +00:45:59.260 --> 00:46:00.040 +that can do summaries. + +00:46:00.860 --> 00:46:01.220 +I see. + +00:46:01.620 --> 00:46:04.880 +And the doc string can be effectively a Jinja template. + +00:46:05.460 --> 00:46:05.520 +Yes. + +00:46:05.820 --> 00:46:07.560 +If you wanted to, it could be a Jinja template. + +00:46:08.200 --> 00:46:09.780 +Alternatively, what you can also do + +00:46:10.600 --> 00:46:12.420 +is you can also say, well, what the function returns + +00:46:12.660 --> 00:46:13.140 +is the prompt. + +00:46:13.340 --> 00:46:14.500 +That's also something you can do. + +00:46:15.060 --> 00:46:15.400 +Got it. + +00:46:16.080 --> 00:46:18.960 +But this is just a very quick hack for some of the stuff + +00:46:19.040 --> 00:46:19.940 +that I want to do real quick. + +00:46:20.860 --> 00:46:22.960 +I don't necessarily recommend other people to use this, + +00:46:23.090 --> 00:46:25.440 +but this is something you can build on top of the LLM library + +00:46:25.620 --> 00:46:26.280 +from Simon Willison. + +00:46:26.470 --> 00:46:28.100 +So if you want to make your own syntactic sugar, + +00:46:28.210 --> 00:46:29.640 +your own abstractions, it's definitely something + +00:46:29.760 --> 00:46:30.660 +that you can also do. + +00:46:31.700 --> 00:46:33.040 +And also, again, the name of the game here + +00:46:33.160 --> 00:46:33.760 +is quick iteration. + +00:46:34.660 --> 00:46:37.240 +So also feel free to think of your own tools now and again. 
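The decorator mechanics being described can be sketched without calling a real model. This is not smartfunc itself: smartfunc wires the docstring through Simon Willison's llm library to an actual backend, while here a stub function plays the model and `str.format` stands in for a Jinja template. All names are illustrative.

```python
# Sketch of the docstring-as-prompt trick: the decorator binds the function's
# arguments, fills them into the docstring template, and sends the result to
# a backend. fake_backend is a stand-in for a real LLM call.
import inspect


def fake_backend(prompt):
    return f"[model reply to: {prompt}]"


def llm_func(backend):
    def deco(fn):
        sig = inspect.signature(fn)

        def wrapper(*args, **kwargs):
            bound = sig.bind(*args, **kwargs)
            # The docstring is the prompt template; arguments fill the slots.
            prompt = fn.__doc__.format(**bound.arguments)
            return backend(prompt)

        return wrapper
    return deco


@llm_func(backend=fake_backend)
def generate_summary(paragraph: str):
    """Summarize this: {paragraph}"""


print(generate_summary("LLMs are just APIs."))
```

Note that the decorated function body never runs; only its signature and docstring matter, which is exactly what makes the pattern read like declaring an endpoint rather than writing logic.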
+ +00:46:37.290 --> 00:46:39.500 +But you want the base layer to be as boring as possible. + +00:46:39.720 --> 00:46:42.460 +And Simon Willison's library, again, does that in a good way. + +00:46:42.740 --> 00:46:43.340 +Right, right. + +00:46:43.460 --> 00:46:45.360 +Just don't make me think about the details. + +00:46:45.960 --> 00:46:46.120 +Yes. + +00:46:46.120 --> 00:46:48.780 +Let me swap out the model providers, which is actually + +00:46:48.800 --> 00:46:51.660 +doing quite a bit, but not conceptually. + +00:46:51.900 --> 00:46:52.040 +Yeah. + +00:46:52.530 --> 00:46:56.260 +So again, one thing I do think at this point in time, + +00:46:56.330 --> 00:46:59.200 +if you've not played around with this software stuff already, + +00:46:59.460 --> 00:47:01.900 +I do recommend maybe doing that sooner rather than later. + +00:47:02.380 --> 00:47:03.880 +And I think Simon Willison's approach, + +00:47:03.890 --> 00:47:04.880 +if you're already a Python person, + +00:47:04.930 --> 00:47:06.500 +it's probably the simplest way to get started. + +00:47:06.700 --> 00:47:10.400 +But I do want to warn people in the sense of these are just tools. + +00:47:10.920 --> 00:47:11.440 +Tools are useful. + +00:47:11.710 --> 00:47:12.120 +Tools are great. + +00:47:13.220 --> 00:47:15.360 +But at some point, it's not so much about the tools that you use, + +00:47:15.520 --> 00:47:16.980 +but more about how you think about the tools, + +00:47:17.240 --> 00:47:19.020 +Like the way you interact with these tools, + +00:47:19.060 --> 00:47:21.440 +is it not some, like you can use a hammer + +00:47:21.500 --> 00:47:22.240 +in many different ways, + +00:47:22.740 --> 00:47:23.640 +but the way you think about a hammer + +00:47:23.780 --> 00:47:24.860 +is like also equally important. + +00:47:25.020 --> 00:47:26.160 +- Yeah, 100%. 
+
+00:47:26.360 --> 00:47:28.760
+So speaking of thinking about how this goes,
+
+00:47:29.500 --> 00:47:30.540
+one of the challenges is,
+
+00:47:30.780 --> 00:47:33.340
+so let's just take this example here.
+
+00:47:33.760 --> 00:47:36.180
+It says, the example in your smartfunc
+
+00:47:36.600 --> 00:47:38.460
+GitHub readme, it says generate summary,
+
+00:47:38.700 --> 00:47:40.500
+takes some text, it says generate a summary
+
+00:47:40.600 --> 00:47:42.320
+of the following text, colon, text.
+
+00:47:42.860 --> 00:47:43.260
+Yes, text.
+
+00:47:43.440 --> 00:47:43.920
+Super straightforward.
+
+00:47:44.480 --> 00:47:48.700
+But what if you, what if you changed the prompt a little bit and said, give me all of the
+
+00:47:48.940 --> 00:47:51.940
+Python libraries discussed in the following markdown content.
+
+00:47:52.500 --> 00:47:52.680
+Right.
+
+00:47:53.200 --> 00:47:59.880
+And do you want a novella back, like a novel back, when really what you want is like,
+
+00:47:59.880 --> 00:48:03.100
+I want a list of strings and URLs or whatever.
+
+00:48:03.100 --> 00:48:07.360
+Like how do I get programmability instead of just chatability, I guess.
+
+00:48:07.680 --> 00:48:07.860
+Yeah.
+
+00:48:08.060 --> 00:48:13.140
+So the thing that Simon Willison's library does allow you to do is you are able to
+
+00:48:13.160 --> 00:48:17.920
+say, well, I have a prompt over here, but the output that I'm supposed to get out, well, that has to be
+
+00:48:18.060 --> 00:48:22.360
+a JSON object of the following type. And then you can use Pydantic. So you can do something like, well,
+
+00:48:22.460 --> 00:48:25.760
+I expect a list of strings to come out over here in the case that you're trying to detect Python
+
+00:48:25.960 --> 00:48:30.840
+libraries or Pokemon names or what have you. Or you can say, well, it's more like a classification.
+
+00:48:31.060 --> 00:48:35.720
+I believe you can pass a literal with, like, only these values, and that's also a constraint. And
+
+00:48:35.720 --> 00:48:41.160
+the reason that you can do that is some of these LLMs are actually trained on specific tasks. One of
+
+00:48:41.160 --> 00:48:45.340
+these tasks could be tool calling. Another one of these tasks that they're trained for these days is
+
+00:48:45.430 --> 00:48:49.840
+that you have to say the right word here, something, something output, structured output,
+
+00:48:51.070 --> 00:48:56.700
+so that the LLM can sort of give a guarantee that if you declare that you get a list of strings out,
+
+00:48:56.710 --> 00:48:59.380
+that you actually do get a list of strings out. And that's something that these things are
+
+00:48:59.760 --> 00:49:04.340
+typically trained for. So that's something you can do. If you're using open source models,
+
+00:49:05.200 --> 00:49:09.960
+definitely check them out. They're cool. But quality does vary, is what I will say.
+
+00:49:10.960 --> 00:49:12.460
+So that's something you always keep in the back of your mind.
+
+00:49:12.500 --> 00:49:15.260
+And also, the more complex you make your Pydantic objects,
+
+00:49:15.440 --> 00:49:18.220
+like if you make like a nested thing of a nested thing of a nested thing,
+
+00:49:18.420 --> 00:49:20.840
+at some point the LLM is going to have a bit of trouble with that.
+
+00:49:21.760 --> 00:49:22.820
+Even though Pydantic is of course great,
+
+00:49:23.040 --> 00:49:26.840
+but like you make it harder for the LLM to do the right thing at some point.
+
+00:49:27.740 --> 00:49:30.900
+But that's also a thing that you definitely want to use
+
+00:49:30.940 --> 00:49:32.040
+if you're doing stuff with software.
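The contract being discussed — declare the shape you expect, including a Literal-style set of allowed labels, and refuse anything else — can be shown in stdlib-only form. With llm plus Pydantic you would declare a model and let the provider enforce the schema during decoding; this hand-rolled validator, with illustrative names, just makes visible what that buys you.

```python
# Stdlib sketch of validating a structured LLM reply: a list of strings for
# detected libraries, plus a sentiment label constrained to a fixed set
# (the role a typing.Literal plays in a Pydantic model).
import json

ALLOWED_LABELS = {"positive", "negative", "neutral"}


def parse_libraries_reply(raw):
    data = json.loads(raw)
    libs = data["libraries"]
    if not (isinstance(libs, list) and all(isinstance(x, str) for x in libs)):
        raise ValueError("expected 'libraries' to be a list of strings")
    if data["sentiment"] not in ALLOWED_LABELS:
        raise ValueError(f"'sentiment' must be one of {sorted(ALLOWED_LABELS)}")
    return libs, data["sentiment"]


reply = '{"libraries": ["diskcache", "llm"], "sentiment": "positive"}'
print(parse_libraries_reply(reply))
```

Once the reply passes this gate you can loop over it like any other Python data, which is the "programmability instead of chatability" point from the conversation.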
+
+00:49:32.220 --> 00:49:34.100
+Like the really cool thing about Pydantic is you can say,
+
+00:49:34.160 --> 00:49:36.000
+I expect something that looks like this to come out,
+
+00:49:36.500 --> 00:49:39.480
+and you can force an LLM to also actually do that.
+
+00:49:39.640 --> 00:49:41.200
+Like, you will get the right types that come out.
+
+00:49:41.320 --> 00:49:43.680
+The contents might still be maybe a bit spooky.
+
+00:49:43.790 --> 00:49:45.360
+But at least you get the right types out,
+
+00:49:45.450 --> 00:49:46.980
+which makes it a lot more convenient to write software.
+
+00:49:47.440 --> 00:49:49.680
+You can say, I can loop over the things that it sent back,
+
+00:49:49.770 --> 00:49:52.540
+and you're not into vague text parsing land.
+
+00:49:53.119 --> 00:49:54.800
+Well, the main thing you get is--
+
+00:49:55.070 --> 00:49:58.100
+I remember when the first OpenAI models sort of started coming
+
+00:49:58.170 --> 00:50:00.540
+out, I worked back at Explosion.
+
+00:50:00.590 --> 00:50:01.820
+We made this tool called spaCy there.
+
+00:50:02.120 --> 00:50:04.160
+And one of the things that we were really bummed out about
+
+00:50:04.360 --> 00:50:06.220
+was LLMs can only really produce text.
+
+00:50:06.580 --> 00:50:11.760
+So this structural, like if you want to detect a substring in a text, for example,
+
+00:50:11.940 --> 00:50:13.380
+which is the thing that spaCy is really good at,
+
+00:50:13.820 --> 00:50:17.860
+then you really want to guarantee, well, I'm going to select a substring that actually did appear in the text.
+
+00:50:18.000 --> 00:50:22.180
+I want to know exactly where it starts and exactly where it ends, and that's the span that I want to select.
+
+00:50:22.780 --> 00:50:27.460
+So normally in NLP, text comes in and structured information comes out.
+
+00:50:27.780 --> 00:50:30.580
+But here come these LLMs, and text comes in and text comes out.
+ +00:50:30.900 --> 00:50:34.360 +And then you've got to figure out a way to guarantee that structured output actually comes out. + +00:50:34.480 --> 00:50:36.800 +So that's the thing that the LLM providers also acknowledge. + +00:50:36.940 --> 00:50:39.200 +So that's something that they're actually training these models for. + +00:50:39.300 --> 00:50:42.280 +And I think it's a lot better these days, + +00:50:42.360 --> 00:50:44.360 +but it was definitely a huge part of the problem + +00:50:45.060 --> 00:50:47.260 +like three to four years ago when these things first happened. + +00:50:47.400 --> 00:50:48.120 +It was super annoying. + +00:50:48.380 --> 00:50:52.700 +So if you use Pydantic, you define a Pydantic model + +00:50:52.840 --> 00:50:54.320 +that derives from pydantic.BaseModel. + +00:50:54.600 --> 00:50:57.700 +You put the types, type information, fields, et cetera. + +00:50:58.100 --> 00:50:59.260 +And then what does it do? + +00:50:59.360 --> 00:51:02.480 +Does it kind of use the schema generation + +00:51:02.800 --> 00:51:04.200 +that Pydantic already has? + +00:51:04.360 --> 00:51:06.240 +Like, people are familiar with FastAPI. + +00:51:06.780 --> 00:51:10.220 +You set the response model, it'll do an OpenAPI spec, + +00:51:10.380 --> 00:51:12.480 +which I believe just comes more or less from Pydantic, right? + +00:51:12.700 --> 00:51:13.460 +Could be. + +00:51:13.530 --> 00:51:16.420 +I think it's a JSON standard in the end. + +00:51:16.620 --> 00:51:18.580 +Like, you know how you can generate JSON schemas as well? + +00:51:18.880 --> 00:51:21.020 +I think it's that spec that it's using under the hood + +00:51:21.100 --> 00:51:24.220 +that is then also given off to OpenAI or what have you. + +00:51:24.310 --> 00:51:25.960 +So I think in the end, it's a JSON spec, + +00:51:26.050 --> 00:51:26.960 +but I could be wrong. + +00:51:27.000 --> 00:51:29.440 +But it's driven by the Pydantic model.
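The schema question above is easy to check for yourself; Pydantic exposes the JSON Schema it generates with one method call (the Pizza model here is just an example):

```python
from pydantic import BaseModel


class Pizza(BaseModel):
    kind: str
    size: str
    toppings: list[str]


# Pydantic emits a standard JSON Schema document; roughly this is what
# a provider receives when you ask for structured output shaped like Pizza.
schema = Pizza.model_json_schema()
print(schema["type"])      # object
print(schema["required"])  # ['kind', 'size', 'toppings']
```

This is the same machinery FastAPI leans on when it builds an OpenAPI spec from a response model.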
+ +00:51:29.570 --> 00:51:31.320 +And then that's given as part of the prompt. + +00:51:31.480 --> 00:51:32.400 +Like, here's my question. + +00:51:32.620 --> 00:51:34.140 +Your answer will look like this. + +00:51:34.600 --> 00:51:36.560 +And it uses the Pydantic, something to that effect. + +00:51:36.860 --> 00:51:39.020 +Yeah, there's one small caveat there + +00:51:39.150 --> 00:51:41.560 +in the sense that you can also add custom types + +00:51:42.160 --> 00:51:42.820 +with Pydantic. + +00:51:42.830 --> 00:51:43.660 +So you can do something like, + +00:51:43.690 --> 00:51:45.980 +well, this is a non-negative integer, for example. + +00:51:46.440 --> 00:51:48.000 +And you can definitely imagine that, + +00:51:48.260 --> 00:51:49.140 +especially with these validations + +00:51:49.300 --> 00:51:50.100 +that you can write in Python, + +00:51:50.520 --> 00:51:52.080 +not everything that you can come up with there + +00:51:52.130 --> 00:51:53.660 +is going to be supported by this JSON spec. + +00:51:54.220 --> 00:51:56.380 +So you are going to get some of these weird edge cases, + +00:51:56.560 --> 00:51:57.960 +maybe, where Pydantic can do more + +00:51:57.990 --> 00:51:59.940 +than what the JSON spec can do on your behalf. + +00:52:00.420 --> 00:52:01.640 +There is a really cool library, though, + +00:52:01.820 --> 00:52:04.060 +called Instructor that actually-- + +00:52:04.060 --> 00:52:05.380 +Yes, I was just thinking about that. + +00:52:06.380 --> 00:52:08.340 +So what they do is actually kind of cute. + +00:52:09.760 --> 00:52:11.280 +The impression I have is that they acknowledge + +00:52:11.520 --> 00:52:13.620 +that Pydantic can actually do a bit more than what the JSON + +00:52:13.760 --> 00:52:16.520 +spec can do, especially because the validator that you write + +00:52:16.620 --> 00:52:18.760 +can really be a custom whatever thing you care about.
+ +00:52:19.680 --> 00:52:21.600 +So what they do as a trick is they say, well, + +00:52:21.800 --> 00:52:25.220 +if the LLM gives me something back that doesn't get validated + +00:52:25.460 --> 00:52:27.500 +by Pydantic, we take the error message, + +00:52:28.700 --> 00:52:29.880 +together with the thing that we got back, + +00:52:30.060 --> 00:52:32.360 +make a new prompt to the LLM and just ask it again. + +00:52:33.460 --> 00:52:34.740 +And see, maybe now with the hint, + +00:52:34.800 --> 00:52:37.080 +it is able to give us the proper structure back. + +00:52:37.300 --> 00:52:38.220 +That's sort of the trick. + +00:52:38.220 --> 00:52:38.440 +It sounds silly, + +00:52:38.800 --> 00:52:40.800 +but anyone who's done agentic AI and stuff, + +00:52:41.060 --> 00:52:42.600 +like, you know, Claude or whatever, + +00:52:43.140 --> 00:52:44.800 +that's a lot of the magic, where you say, + +00:52:44.920 --> 00:52:45.500 +it tried this. + +00:52:45.580 --> 00:52:46.300 +Oh, I see. + +00:52:46.800 --> 00:52:47.780 +I actually forgot an import. + +00:52:47.940 --> 00:52:48.520 +Oh, I see. + +00:52:48.760 --> 00:52:49.760 +I passed the wrong command. + +00:52:49.800 --> 00:52:51.020 +Let me try again a different way. + +00:52:51.160 --> 00:52:51.800 +Oh, now it's working. + +00:52:52.000 --> 00:52:52.440 +And, you know, + +00:52:52.960 --> 00:52:54.700 +I think there's a lot of potential with that. + +00:52:54.900 --> 00:52:56.200 +So, well, definitely true. + +00:52:56.400 --> 00:52:59.280 +The one thing I think is safe to say at this point, + +00:52:59.380 --> 00:53:00.900 +although I'm not a researcher on this, right? + +00:53:01.010 --> 00:53:02.140 +So don't pin me down on this. + +00:53:02.820 --> 00:53:05.880 +That library did originate from an early era of LLMs + +00:53:05.980 --> 00:53:07.440 +when the structured stuff wasn't really + +00:53:07.530 --> 00:53:08.620 +being trained for yet.
+ +00:53:09.500 --> 00:53:12.640 +And it is my impression that LLMs have gotten a fair bit better + +00:53:13.060 --> 00:53:15.620 +at this structured task at this point. + +00:53:15.800 --> 00:53:18.680 +So you could argue that maybe that library is solving + +00:53:18.780 --> 00:53:20.900 +a solved problem with the technique that they've got. + +00:53:21.320 --> 00:53:22.600 +But I still don't want to discount it, + +00:53:22.630 --> 00:53:25.100 +because inevitably, at some point, + +00:53:25.100 --> 00:53:27.060 +you might want to go for a very lightweight model + +00:53:27.160 --> 00:53:29.340 +that you can run locally that isn't fine-tuned + +00:53:29.340 --> 00:53:33.740 +for the structured output task. And in that situation, this technique can actually do a lot of good. + +00:53:33.910 --> 00:53:37.760 +Like, you might be prompting the LLM like three times, but you might be able to get away + +00:53:37.850 --> 00:53:41.320 +with that because the model is just so much lighter, because it doesn't have to do all those tasks that + +00:53:41.320 --> 00:53:45.960 +those big LLMs are trained for. Right. Especially if you're running on your machine or you're + +00:53:46.030 --> 00:53:49.540 +self-hosting the model, you're not like, oh, it's going to cost us so much money. It's just, + +00:53:49.860 --> 00:53:53.720 +it'll take a little longer, but it's fine. Yeah. So there's a lot of support for + +00:53:53.740 --> 00:53:55.720 +Instructor. There's also Ollama support for + +00:53:55.920 --> 00:53:57.760 +the llm library from Simon Willison, + +00:53:57.920 --> 00:53:59.580 +by the way. But yeah, + +00:54:00.200 --> 00:54:01.740 +you can definitely imagine this is still going + +00:54:01.740 --> 00:54:03.680 +to be relevant with the Ollama stuff going forward. + +00:54:04.220 --> 00:54:05.960 +And also, it's a reasonable + +00:54:06.260 --> 00:54:07.220 +trick if you think about it.
+ +00:54:08.280 --> 00:54:09.960 +The main thing that we do is we just add a retry + +00:54:10.080 --> 00:54:11.760 +mechanic to the structured outputs by using + +00:54:11.980 --> 00:54:13.400 +the Pydantic validator, effectively. + +00:54:13.859 --> 00:54:15.800 +I mean, think about what you probably do, certainly what + +00:54:15.840 --> 00:54:17.700 +I do when I'm working with ChatGPT or something. + +00:54:18.040 --> 00:54:19.740 +It gives me a wrong answer. Like, no, this is not what I + +00:54:19.860 --> 00:54:21.680 +wanted. This is what I wanted. It's like, oh, okay. + +00:54:21.940 --> 00:54:22.160 +I see. + +00:54:22.300 --> 00:54:24.820 +And it's kind of automating that, more or less. + +00:54:25.080 --> 00:54:26.660 +It's certainly an attempt, yeah. + +00:54:27.860 --> 00:54:30.160 +In my experience, I have also seen moments + +00:54:30.260 --> 00:54:33.640 +when it sends the same error back five times, + +00:54:33.680 --> 00:54:34.600 +and then it just kind of fails. + +00:54:34.780 --> 00:54:36.620 +So there is also a limit to the retry mechanic. + +00:54:37.100 --> 00:54:37.580 +Sure, sure. + +00:54:38.060 --> 00:54:40.700 +It's a little bit like tenacity or stamina. + +00:54:41.180 --> 00:54:41.360 +Exactly. + +00:54:41.780 --> 00:54:43.160 +For APIs or whatever. + +00:54:43.500 --> 00:54:45.380 +It uses tenacity under the hood, if I'm not mistaken. + +00:54:45.640 --> 00:54:47.680 +So there's definitely that mechanism in there. + +00:54:47.920 --> 00:54:48.240 +Nice. + +00:54:48.540 --> 00:54:49.400 +I just-- it didn't work. + +00:54:49.520 --> 00:54:49.860 +Try it again. + +00:54:50.080 --> 00:54:51.500 +It's like turn it off and turn it on again. + +00:54:51.680 --> 00:54:52.260 +See if it works now? + +00:54:52.600 --> 00:54:53.260 +Yep, pretty much. + +00:54:53.460 --> 00:54:53.540 +Television. + +00:54:54.020 --> 00:54:55.600 +Yeah, a little smarter than that, though, + +00:54:55.660 --> 00:54:57.740 +because it does validate and return the error.
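What that validate-and-retry loop looks like, hand-rolled as a sketch rather than Instructor's actual API; `call_llm` is a stand-in for any completion function, and the stubbed answers are made up:

```python
from pydantic import BaseModel, ValidationError


class Pizza(BaseModel):
    kind: str
    size: str


def ask_with_retry(call_llm, prompt: str, max_attempts: int = 3) -> Pizza:
    """Feed validation errors back to the model until the output parses."""
    for _ in range(max_attempts):
        raw = call_llm(prompt)
        try:
            return Pizza.model_validate_json(raw)
        except ValidationError as err:
            # The Instructor-style trick: append the bad output and the
            # error message to the prompt and simply ask again.
            prompt += f"\nYour last answer: {raw}\nIt failed validation: {err}\nTry again."
    raise RuntimeError("model never produced valid output")


# Stub LLM for demonstration: fails once (missing "size"), then succeeds.
_answers = iter(['{"kind": "pepperoni"}', '{"kind": "pepperoni", "size": "large"}'])
pizza = ask_with_retry(lambda p: next(_answers), "Order a pizza as JSON.")
print(pizza)
```

The limit Vincent mentions is the `max_attempts` cap: if the model keeps returning the same error, the loop gives up rather than spinning forever.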
+ +00:54:58.080 --> 00:54:58.920 +So super cool. + +00:54:59.180 --> 00:54:59.300 +Yep. + +00:54:59.660 --> 00:54:59.840 +Okay. + +00:55:00.580 --> 00:55:02.660 +Have you worked with, I guess, I don't know, + +00:55:02.940 --> 00:55:04.960 +higher order programming models? + +00:55:05.160 --> 00:55:07.920 +So, you know, I saw Samuel Colvin yesterday + +00:55:08.260 --> 00:55:10.000 +talking about Pydantic AI. + +00:55:10.400 --> 00:55:14.880 +I saw on X, I saw Sydney Runkle talking about + +00:55:15.080 --> 00:55:17.040 +some big release, like a 1.0 release + +00:55:17.040 --> 00:55:18.020 +or something at LangChain. + +00:55:18.520 --> 00:55:19.700 +What are your thoughts on some of these + +00:55:19.680 --> 00:55:21.780 +higher order orchestration libraries? + +00:55:22.880 --> 00:55:25.700 +I mean, the main thing for me, at least, + +00:55:26.200 --> 00:55:28.340 +this was a demo I did do with the Pydantic AI one. + +00:55:29.360 --> 00:55:33.740 +I do think types are quite powerful when you think of LLMs. + +00:55:34.600 --> 00:55:37.880 +There's a live stream you can check on the Marimo YouTube + +00:55:38.060 --> 00:55:39.360 +if you're keen to see a full demo. + +00:55:39.900 --> 00:55:41.400 +But let me just sketch you a problem that + +00:55:41.400 --> 00:55:42.380 +used to be really hard. + +00:55:42.600 --> 00:55:44.820 +Six years ago, we used to work at this company called Rasa. + +00:55:45.060 --> 00:55:46.800 +We worked on chatbot software. + +00:55:47.240 --> 00:55:49.640 +And a thing that was really hard is you-- + +00:55:49.660 --> 00:55:51.160 +Let's say you've got a chatbot where you can order a pizza. + +00:55:51.900 --> 00:55:53.920 +So you can do things like, yep, that's the one. + +00:55:54.380 --> 00:55:56.320 +He's also one of the Pydantic AI engineers, + +00:55:56.800 --> 00:55:58.600 +and he's also the creator of Starlette, by the way. + +00:55:58.820 --> 00:56:00.580 +Cool project. Marimo is built on top of Starlette.
+ +00:56:01.519 --> 00:56:02.580 +Main maintainer of that project. + +00:56:02.860 --> 00:56:04.700 +He deserves to get more high fives from the community, + +00:56:04.940 --> 00:56:05.980 +so I do want to point that out. + +00:56:07.420 --> 00:56:08.840 +Well, let's say you're working on a pizza bot, + +00:56:09.600 --> 00:56:11.200 +and okay, someone is ordering a pizza. + +00:56:11.940 --> 00:56:13.180 +Suppose you go to the bot and you say, + +00:56:13.320 --> 00:56:15.480 +"Hey, I want to order a pepperoni pizza." + +00:56:15.860 --> 00:56:17.160 +But you can imagine there's a pizza type, + +00:56:17.540 --> 00:56:19.200 +and a pizza type says, "What's the kind of pizza? + +00:56:19.480 --> 00:56:20.360 +What's the size of the pizza? + +00:56:20.840 --> 00:56:21.820 +And do you want to have extra toppings?" + +00:56:22.200 --> 00:56:23.900 +And that has to be a list of ingredients, let's say. + +00:56:24.240 --> 00:56:28.880 +But if I tell you, hey, I want to have a pepperoni pizza, + +00:56:29.060 --> 00:56:30.880 +but I don't know the size yet, well, then I + +00:56:30.940 --> 00:56:33.120 +need the LLM to ask a good follow-up question. + +00:56:33.500 --> 00:56:35.380 +OK, how are you going to build that logic? + +00:56:35.880 --> 00:56:37.420 +Because that can get quite gnarly quite quickly, + +00:56:37.660 --> 00:56:39.140 +especially if, in this case, the pizza is simple. + +00:56:39.280 --> 00:56:40.740 +But you can come up with a complicated object, + +00:56:41.020 --> 00:56:42.300 +like a complicated form, almost. + +00:56:42.840 --> 00:56:43.120 +And then-- + +00:56:43.120 --> 00:56:43.520 +Right, right. + +00:56:43.840 --> 00:56:46.560 +I only need to ask this question if they've answered that one + +00:56:46.620 --> 00:56:47.240 +in that way. + +00:56:47.720 --> 00:56:49.380 +Right, it could get really tricky, yeah.
+ +00:56:49.740 --> 00:56:50.720 +Well, and then the question is, + +00:56:50.920 --> 00:56:53.340 +how can you actually get the LLM to do this for you? + +00:56:53.740 --> 00:56:55.220 +And there's a trick that you can do with a type, + +00:56:55.420 --> 00:56:56.200 +because what you can say is, + +00:56:56.580 --> 00:56:58.840 +well, the response from the LLM has to be of type pizza, + +00:56:59.579 --> 00:57:00.720 +or, bar, string: Pizza | str. + +00:57:01.300 --> 00:57:03.020 +And then if the prompt says something along the lines of, + +00:57:03.100 --> 00:57:05.600 +well, parse the output of the user such that it is a pizza, + +00:57:06.020 --> 00:57:08.200 +but then if this validation error occurs, + +00:57:08.320 --> 00:57:09.980 +oh, how do you express that logic? + +00:57:10.220 --> 00:57:11.320 +Well, you can express it with a type. + +00:57:11.580 --> 00:57:12.480 +You just say, this is a string, + +00:57:12.960 --> 00:57:14.980 +or it's the type that I'm actually interested in, the pizza.
+ +00:57:15.420 --> 00:57:16.680 +And then the LLM has to figure out, + +00:57:16.860 --> 00:57:20.500 +is the response either going to be the string where I'm asking the user for missing information, + +00:57:21.040 --> 00:57:24.600 +or is it going to be the pizza object? And if it's a pizza object, then the code will actually talk + +00:57:24.740 --> 00:57:30.480 +to the back end. So you do get to rethink how typing works, because we have LLMs at our disposal now. You + +00:57:30.480 --> 00:57:34.700 +can actually have the LLM handle some of that logic, as long as your type is strict about + +00:57:34.710 --> 00:57:40.360 +it, or well-defined, I should say. It's a bit of a weird, advanced feature of types and LLMs, + +00:57:40.540 --> 00:57:46.240 +but I can definitely imagine some of the Pydantic AI stuff does use this trick in order + +00:57:46.140 --> 00:57:51.020 +to perform some of the higher order stuff. And maybe you don't need a full framework, as long as you are + +00:57:51.180 --> 00:57:56.520 +really deliberate about your types. That's kind of the mentality I try to + +00:57:56.520 --> 00:58:00.320 +have right now. Interesting. Like, let's not worry about the big framework just yet. If we + +00:58:00.320 --> 00:58:04.000 +can just focus on the right types and make sure that all that stuff is kind of sane, then you also + +00:58:04.080 --> 00:58:08.940 +keep the abstractions at bay, which is, I think, also convenient, especially early on. It sounds a little + +00:58:09.100 --> 00:58:15.400 +bit like an analogy you might be able to draw with databases. Like, you could have a really strongly + +00:58:15.420 --> 00:58:21.880 +structured Postgres or relational database where the database thing, aka the AI, is in charge of it. + +00:58:22.140 --> 00:58:29.980 +Or you could be using an ODM or ORM where the code will only accept certain responses, right?
So like, + +00:58:29.990 --> 00:58:34.100 +for example, you can get away with talking to a document database if you have strongly typed + +00:58:34.260 --> 00:58:37.740 +models. If you're just passing dictionaries, you're going to end up in a bad place real quick, + +00:58:38.100 --> 00:58:42.480 +because something's not going to match, right? So it's kind of like the Pydantic thing is acting as a + +00:58:42.520 --> 00:58:46.220 +check for when it's not quite perfect. I don't know. I feel like there's an analogy there. + +00:58:46.360 --> 00:58:52.520 +There's a bit of an analogy there. The main thing with the LLM is like, oh, how much of the logic do + +00:58:52.580 --> 00:58:57.140 +you want to control yourself as a guarantee? And how much would you like the LLM to do? And + +00:58:57.540 --> 00:59:02.080 +oh, that's always kind of a struggle. Like, what is the right glue for that? And it turns out that + +00:59:02.240 --> 00:59:07.740 +Pydantic is in a very funny, unique position to say, how about we just use types? And that actually + +00:59:07.760 --> 00:59:08.420 +just kind of works. + +00:59:08.859 --> 00:59:14.100 +How ironic is it that that's such a popular thing coming out of an effectively untyped language? + +00:59:15.300 --> 00:59:15.600 +Yeah. + +00:59:16.140 --> 00:59:17.460 +Maybe that's why it's so popular. + +00:59:18.040 --> 00:59:19.100 +I think it partly is. + +00:59:19.320 --> 00:59:23.100 +I mean, it's my impression that there's a lot of this stuff also happening on the JavaScript + +00:59:23.300 --> 00:59:24.120 +side of things, if I'm honest. + +00:59:24.400 --> 00:59:27.100 +Now, for the JavaScript side of things, the argument is, of course, oh, you can also build + +00:59:27.100 --> 00:59:30.380 +the web UIs a bit easier, and the user experience, and that whole story. + +00:59:30.920 --> 00:59:32.340 +But they also do a lot of this validation stuff.
+ +00:59:32.420 --> 00:59:34.420 +I forget the library, but there's also-- + +00:59:35.400 --> 00:59:36.700 +I'm learning a little bit of JavaScript now. + +00:59:37.100 --> 00:59:40.100 +There are actually Pydantic-like things happening in JavaScript land as well. + +00:59:40.460 --> 00:59:42.100 +Yeah, it's probably jidantic. + +00:59:44.680 --> 00:59:45.340 +Just-dantic. + +00:59:46.600 --> 00:59:48.560 +Just-dantic, yeah, j-s-dantic. + +00:59:49.340 --> 00:59:50.820 +Or TypeScript or whatever. + +00:59:50.900 --> 00:59:54.240 +But there are also ORM libraries for JavaScript that are like type-y, + +00:59:54.300 --> 00:59:56.200 +but not like super TypeScript just yet. + +00:59:56.420 --> 00:59:59.420 +Yeah, there's a lot of interesting analogies from the JavaScript space for us. + +00:59:59.740 --> 01:00:03.680 +You know, like the whole TypeScript team just rewrote everything in Go, + +01:00:04.280 --> 01:00:06.700 +whereas we're seeing a lot of stuff in Python being rewritten in... + +01:00:06.820 --> 01:00:12.600 +Rust. Rust. Yeah, exactly. So pretty interesting. But let's close this out a little bit with + +01:00:12.740 --> 01:00:18.240 +like running some of our own models. Okay. So clearly there's Anthropic, there's OpenAI, + +01:00:18.620 --> 01:00:22.380 +there's, you know, all the big, huge foundational models. But some of these things we're talking + +01:00:22.560 --> 01:00:27.080 +about, it's like this Pydantic schema validation type of thing.
And so on, you know, maybe we could, + +01:00:27.340 --> 01:00:30.680 +as you said, get away with running a smaller model, maybe small enough that we could run it + +01:00:30.700 --> 01:00:39.080 +ourselves on our servers, or on my Mac mini here, M2 Pro, 32 gigs of RAM. I have the OpenAI + +01:00:39.900 --> 01:00:45.060 +20-billion-parameter model running as like my default LLM for any of my code that I write that + +01:00:45.060 --> 01:00:50.480 +needs to talk to an LLM. So, yeah, so we actually have a video on the Marimo channel on that one, + +01:00:51.020 --> 01:00:56.200 +go check that out later, but basically I've recorded that. Link, we'll put it in the show notes. Yeah. No, so + +01:00:56.220 --> 01:01:01.060 +I kind of find myself exploring all these different open source LLMs like once every two months or so. + +01:01:01.190 --> 01:01:06.300 +And then Marimo has a couple of interesting integrations that make it really easy to either hook into Ollama or, + +01:01:06.490 --> 01:01:09.580 +and that's like the only thing I might change on the course that we made, + +01:01:09.860 --> 01:01:11.500 +there's now also this service called OpenRouter. + +01:01:11.790 --> 01:01:13.160 +Like, I don't know if you've ever seen that one. + +01:01:13.170 --> 01:01:13.880 +Yes, I've heard of it. + +01:01:14.779 --> 01:01:17.180 +Because I was looking at Cline.bot. + +01:01:17.290 --> 01:01:18.380 +Are you familiar with Cline.bot? + +01:01:18.380 --> 01:01:18.600 +Oh, yeah. + +01:01:19.080 --> 01:01:19.460 +Cline is cool. + +01:01:19.720 --> 01:01:20.320 +Cline is really cool. + +01:01:20.680 --> 01:01:22.000 +I've been using that for a bit. + +01:01:22.980 --> 01:01:24.040 +Tell me what Cline is real quick. + +01:01:24.460 --> 01:01:28.080 +So Cline is basically Copilot for VS Code, + +01:01:28.130 --> 01:01:29.100 +but the one thing that they do + +01:01:29.110 --> 01:01:30.840 +is they really let you use any model, + +01:01:31.160 --> 01:01:31.900 +like just plug it in.
+ +01:01:32.280 --> 01:01:33.780 +And they are really strict about plan mode + +01:01:33.950 --> 01:01:36.060 +versus like run mode, if it makes sense. + +01:01:36.260 --> 01:01:37.100 +So you really got to, + +01:01:37.560 --> 01:01:39.340 +and one, the UI is kind of cool. + +01:01:39.410 --> 01:01:41.440 +So like every time that you're like planning + +01:01:41.490 --> 01:01:42.040 +or doing whatever, + +01:01:42.190 --> 01:01:45.060 +you kind of see the progress bar of your context. + +01:01:45.510 --> 01:01:47.060 +And then you also see how much money it costs. + +01:01:47.200 --> 01:01:48.880 +So you're like always aware of the fact that like, + +01:01:49.000 --> 01:01:50.720 +okay, this thing is costing me bucks now. + +01:01:51.759 --> 01:01:54.220 +And it's just a refreshing take on GitHub Copilot. + +01:01:54.440 --> 01:01:57.320 +And I will say, you can't do sessions in parallel, + +01:01:57.430 --> 01:01:59.980 +because it is, I think, at the moment, still stuck to the VS Code + +01:02:00.320 --> 01:02:01.080 +extension ecosystem. + +01:02:01.520 --> 01:02:02.040 +Could be wrong with that. + +01:02:02.140 --> 01:02:04.100 +They just introduced a CLI. + +01:02:04.540 --> 01:02:05.220 +Ah, there you go. + +01:02:05.500 --> 01:02:06.000 +There you go. + +01:02:06.090 --> 01:02:07.160 +I bet you can do it with that. + +01:02:07.400 --> 01:02:08.740 +Yeah, so that's also-- + +01:02:09.140 --> 01:02:11.320 +the main thing that drew me in was the whole, oh, + +01:02:11.720 --> 01:02:14.480 +they're very strict about plan mode versus Go mode, + +01:02:14.870 --> 01:02:16.300 +which I think is definitely super useful. + +01:02:17.250 --> 01:02:18.720 +And on top of that, you have some nice UI. + +01:02:19.020 --> 01:02:20.920 +Yeah, it's also real similar to Cursor and others. + +01:02:21.440 --> 01:02:23.080 +And I think it's a big selling point. 
+ +01:02:23.180 --> 01:02:27.980 +open source, and they don't charge for inference. So you just put in your API key and it just + +01:02:28.100 --> 01:02:31.840 +passes through. But you're right, they do have, like, as it's thinking, it's got a little tick-tick- + +01:02:31.880 --> 01:02:37.640 +tick price for this operation, and it's like, oh, it's up to three cents, five... You know, it's just interesting + +01:02:37.840 --> 01:02:41.480 +to watch it go. You also, it's kind of like the Fibonacci sequence, you kind of realize that it's + +01:02:41.680 --> 01:02:45.700 +additive. So it's like, oh, I've spent like half a dollar now. Oh, and the next problem is going to be + +01:02:45.760 --> 01:02:51.420 +half a dollar plus extra. The next one's going to be like that plus extra. Exactly. It makes you manage your + +01:02:51.420 --> 01:02:52.440 +context size real well. + +01:02:53.040 --> 01:02:54.720 +So anyway, OK, that's a bit of a diversion. + +01:02:54.790 --> 01:02:56.580 +But I think this is a fun tool to shout out. + +01:02:57.140 --> 01:02:57.600 +No, I agree. + +01:02:57.780 --> 01:03:00.100 +So the reason I brought this up is I + +01:03:00.340 --> 01:03:02.200 +saw they were number one on OpenRouter. + +01:03:02.280 --> 01:03:03.340 +I'm like, wait, what's OpenRouter? + +01:03:03.880 --> 01:03:06.400 +Yeah, so OpenRouter is basically you have one API key, + +01:03:06.620 --> 01:03:09.240 +and then they will route to any LLM that you like. + +01:03:09.760 --> 01:03:11.140 +And one thing that's really cool about it + +01:03:11.230 --> 01:03:12.920 +is that if there's a new open source model + +01:03:13.060 --> 01:03:15.340 +that you see all of Twitter and YouTube rave about, + +01:03:15.820 --> 01:03:16.780 +it'll be on OpenRouter. + +01:03:17.060 --> 01:03:18.160 +And then you can just give it a spin.
+ +01:03:18.780 --> 01:03:21.540 +If you have, let's say, stuff in your cache, + +01:03:21.550 --> 01:03:23.720 +and you've got a couple of functions that you use for evaluation, + +01:03:24.070 --> 01:03:25.100 +and you've got a setup for that, + +01:03:25.420 --> 01:03:29.200 +then it's just changing a single string to give a new LLM model a try. + +01:03:29.310 --> 01:03:31.440 +And they will make sure everything is nicely routed. + +01:03:32.260 --> 01:03:34.420 +And that's like the whole service also, basically. + +01:03:34.870 --> 01:03:38.980 +I think they add like a 5% fee or something like that, + +01:03:39.010 --> 01:03:42.160 +but they've got all sorts of GPU providers competing for the lowest price. + +01:03:42.770 --> 01:03:46.020 +So they do all the routing to all the LLM models, + +01:03:46.110 --> 01:03:47.400 +but also the open source ones. + +01:03:47.450 --> 01:03:48.540 +And that's the main convenient bit. + +01:03:48.840 --> 01:03:54.460 +Because I might not have-- so it's partially that I don't have the biggest GPU for the biggest models, + +01:03:55.020 --> 01:03:57.000 +but it's really convenient to just do kind of a for loop. + +01:03:57.310 --> 01:04:02.600 +Like, okay, take the 7B model from Qwen, the 14B model, and just loop over all of them, + +01:04:02.690 --> 01:04:06.860 +and see if there's like a sweet spot as far as quality is concerned and also model size. + +01:04:07.040 --> 01:04:11.600 +Like, those kinds of experiments become a whole lot simpler if you just have one API that has all the models. + +01:04:11.690 --> 01:04:13.140 +And OpenRouter kind of gets you there. + +01:04:13.380 --> 01:04:14.720 +Yeah, okay, very neat. + +01:04:14.960 --> 01:04:16.420 +And it's also why Cline likes using it. + +01:04:16.680 --> 01:04:17.980 +So they have a direct connection with Cline.
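The "change a single string" sweep described here can be sketched in a few lines; behind an OpenRouter-style gateway, only the model field differs per run (the model names below are illustrative, not exact slugs):

```python
import json

# Hypothetical model names: an OpenRouter-style service exposes them all
# behind one OpenAI-compatible endpoint, so only "model" changes per run.
MODELS = [
    "qwen-7b-instruct",
    "qwen-14b-instruct",
    "gpt-oss-20b",
]


def build_request(model: str, prompt: str) -> dict:
    """One OpenAI-style chat payload; the model name is the only variable."""
    return {
        "model": model,
        "messages": [{"role": "user", "content": prompt}],
    }


# The for loop Vincent describes: same prompt, same eval, different model.
requests = [build_request(m, "Summarize this support ticket.") for m in MODELS]
print(json.dumps(requests[0], indent=2))
```

In practice you'd send each payload to the gateway and score the responses with your evaluation functions to find the quality-vs-size sweet spot.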
+ +01:04:18.540 --> 01:04:20.020 +Just in the user interface, you can say, + +01:04:20.100 --> 01:04:21.440 +well, I want to have this model from OpenRouter. + +01:04:21.740 --> 01:04:23.560 +Just click, click, click, and it's all sort of ready to go. + +01:04:24.030 --> 01:04:24.800 +But that's their service. + +01:04:25.140 --> 01:04:25.880 +OK, very neat. + +01:04:26.160 --> 01:04:29.820 +So maybe for people who haven't run their own models, + +01:04:30.060 --> 01:04:31.660 +there's a couple of options. + +01:04:32.240 --> 01:04:33.600 +Sounds like you recommend Ollama. + +01:04:34.020 --> 01:04:35.280 +I mean, that's the one that I have used. + +01:04:35.720 --> 01:04:37.340 +You've used LM Studio, I think? + +01:04:37.560 --> 01:04:40.040 +I use LM Studio because LM Studio + +01:04:40.240 --> 01:04:42.580 +has this UI for discovering and setting them up + +01:04:42.580 --> 01:04:43.940 +and playing and configuring them. + +01:04:43.990 --> 01:04:45.660 +But then you can just say, run in dev mode, + +01:04:45.820 --> 01:04:47.540 +and it just runs, and here we go, + +01:04:47.850 --> 01:04:49.620 +an OpenAI-compatible-- + +01:04:49.820 --> 01:04:50.680 +- There you go. - API endpoint + +01:04:50.970 --> 01:04:51.700 +on my local network. + +01:04:51.950 --> 01:04:54.660 +And so I can just point anything that can do ChatGPT, + +01:04:54.800 --> 01:04:56.640 +I can point it at this, whatever's running there. + +01:04:56.650 --> 01:04:59.360 +And I was running the GPT-OSS 20 billion one + +01:04:59.500 --> 01:05:00.020 +last time I tried. + +01:05:00.480 --> 01:05:00.960 +- There you go. + +01:05:01.230 --> 01:05:01.320 +Okay. + +01:05:01.680 --> 01:05:03.880 +So Ollama also has like a very similar thing. + +01:05:04.520 --> 01:05:06.140 +My impression is that LM Studio in the end + +01:05:06.200 --> 01:05:06.880 +is like pretty similar, + +01:05:07.040 --> 01:05:08.420 +just has a slightly different UI.
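The local setup being described boils down to pointing any OpenAI-style client at a localhost URL. A stdlib-only sketch that builds (but does not send) such a request; the port and model name are examples, assuming LM Studio's default port of 1234 (Ollama uses 11434):

```python
import json
import urllib.request

# Any OpenAI-compatible server works here: LM Studio's dev mode, Ollama, etc.
BASE_URL = "http://localhost:1234/v1"


def chat_request(prompt: str, model: str = "gpt-oss-20b") -> urllib.request.Request:
    """Build a chat completion request for a local OpenAI-compatible server."""
    body = json.dumps(
        {"model": model, "messages": [{"role": "user", "content": prompt}]}
    ).encode()
    return urllib.request.Request(
        f"{BASE_URL}/chat/completions",
        data=body,
        headers={"Content-Type": "application/json"},
    )


req = chat_request("Hello from my own hardware")
print(req.full_url)      # http://localhost:1234/v1/chat/completions
print(req.get_method())  # POST
```

Because the wire format matches, any client library or tool that speaks the ChatGPT API can be redirected to this endpoint just by swapping the base URL.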
+ +01:05:09.040 --> 01:05:10.200 +It's a bit more of an app, I suppose. + +01:05:10.350 --> 01:05:12.840 +Like, Ollama is definitely more of a command line utility. + +01:05:13.960 --> 01:05:15.220 +Does the job though, I would say, + +01:05:15.340 --> 01:05:18.380 +but it doesn't have that much to offer in the UI department. + +01:05:18.670 --> 01:05:20.740 +Yeah, I think if I were ever to try to set it up on my server, + +01:05:21.220 --> 01:05:24.780 +I'd probably just do Ollama, maybe inside Docker or something. + +01:05:24.830 --> 01:05:25.780 +I don't know, something like that. + +01:05:25.980 --> 01:05:27.440 +Yeah, no, true. + +01:05:27.520 --> 01:05:29.580 +The one sort of mysterious thing with a lot of these services is, + +01:05:29.740 --> 01:05:30.660 +like, how do they make money? + +01:05:31.300 --> 01:05:35.560 +And I think I read somewhere that Ollama is going to start doing + +01:05:35.770 --> 01:05:37.320 +OpenRouter kind of stuff as well, + +01:05:37.410 --> 01:05:39.620 +so it wouldn't surprise me if they ended up going in that direction.
+ +01:05:40.750 --> 01:05:42.140 +But I mean, there's all sorts of vendors, + +01:05:42.310 --> 01:05:45.300 +and there's another thing I'm also seeing a lot of nowadays: + +01:05:45.320 --> 01:05:49.080 +use big models that generate a data set, such that they can take a super tiny model and fine- + +01:05:49.300 --> 01:05:53.200 +tune on the data that was generated by the big model. And that way, like, a future maybe that we + +01:05:53.340 --> 01:05:58.200 +might have is that you have a Dutch model that's really good for legal. Maybe we'll have like a + +01:05:58.280 --> 01:06:04.840 +super bespoke LLM for that at some point. Or like an LLM that's really good at CSS, but + +01:06:04.920 --> 01:06:09.300 +only CSS. Like, that's also a thing that we might have. And then those models can be really tiny and + +01:06:09.340 --> 01:06:14.700 +run on your laptop. And who knows, this is the age we live in. But I certainly can foresee a world + +01:06:14.680 --> 01:06:19.000 +like that, where it's like, okay, I'm going to work on a project. Instead of spinning up a general world- + +01:06:19.200 --> 01:06:24.500 +knowledge model, I'm going to just let it, on demand, go, okay, I have a CSS question. Well, what framework + +01:06:24.640 --> 01:06:29.420 +are you using? I'm using Tailwind. Okay, we're going to spin up the Tailwind LLM model, and we need to ask + +01:06:29.520 --> 01:06:33.200 +it a question. Yeah, and then we're going to shut that down. You know what I mean? Like, really focused. But then have some + +01:06:33.300 --> 01:06:37.500 +kind of orchestrating thing that says, now I've got to talk to this, now let me work with that one. + +01:06:37.570 --> 01:06:41.920 +That could be great. Yeah, and it might also be a solution to the whole, oh, the time cutoff is + +01:06:41.880 --> 01:06:43.400 +January 2025, + +01:06:43.920 --> 01:06:46.100 +and everything that happened after, we have no knowledge about.
+ +01:06:46.280 --> 01:06:50.140 +Like it could also be a way for us to have a system where we just retrain a thing weekly + +01:06:50.240 --> 01:06:52.280 +and it's just a diff on top or something like that. + +01:06:52.420 --> 01:06:52.480 +Right. + +01:06:53.180 --> 01:06:53.480 +Yeah. + +01:06:53.620 --> 01:06:54.240 +It could also happen. + +01:06:54.540 --> 01:06:54.660 +Yeah. + +01:06:54.680 --> 01:06:56.320 +And if it were small enough, then it would be okay. + +01:06:56.720 --> 01:06:56.880 +Yeah. + +01:06:57.040 --> 01:06:58.380 +It would be kind of like SETI at home. + +01:06:58.540 --> 01:07:01.000 +All of our stuff would just be like kind of have the fan going. + +01:07:02.119 --> 01:07:06.420 +The, I mean, it's a bit of a science fiction thing, but you can imagine at some point every + +01:07:06.600 --> 01:07:10.560 +home will have solar panels and then maybe every home will also just have a GPU and maybe + +01:07:10.580 --> 01:07:12.600 +we won't have huge-ass server farms. + +01:07:12.740 --> 01:07:14.520 +It's just that we have a network, and everyone + +01:07:14.700 --> 01:07:16.480 +has a GPU, and if my neighbor suddenly + +01:07:16.780 --> 01:07:18.540 +wants to do math, then he can borrow my GPU. + +01:07:18.810 --> 01:07:19.900 +I don't know. We'll see. + +01:07:21.360 --> 01:07:22.620 +It kind of sounds cool, doesn't it? + +01:07:22.870 --> 01:07:24.660 +Yeah. If people like + +01:07:24.820 --> 01:07:25.400 +distributed systems, + +01:07:26.800 --> 01:07:27.860 +might as well go for it. + +01:07:29.200 --> 01:07:30.420 +If you want to add to the + +01:07:32.119 --> 01:07:32.700 +unpredictability of + +01:07:32.750 --> 01:07:34.600 +the LLM, mix in a distributed + +01:07:34.860 --> 01:07:36.540 +grid computing that you don't know about too much either, + +01:07:36.700 --> 01:07:37.540 +and then you'll get some real... 
+
+01:07:38.680 --> 01:07:40.540
+I do want to give one quick shout-out
+
+01:07:40.560 --> 01:07:41.840
+on this one front, though, because I do think
+
+01:07:41.940 --> 01:07:43.120
+it's an idea that's worth pursuing.
+
+01:07:43.240 --> 01:07:45.880
+So in Holland, we've got this one cloud provider called
+
+01:07:46.080 --> 01:07:47.820
+LeafCloud, and I think they're still around.
+
+01:07:48.440 --> 01:07:51.460
+But what they do is they install these server farms,
+
+01:07:51.520 --> 01:07:52.460
+like what we're discussing.
+
+01:07:52.580 --> 01:07:56.720
+They have a GPU rack at the bottom of an apartment building.
+
+01:07:57.500 --> 01:08:00.640
+And any excess heat, which is one of the main cost drivers,
+
+01:08:01.080 --> 01:08:03.900
+that's actually used to preheat the water for the rest
+
+01:08:03.900 --> 01:08:04.380
+of the apartment.
+
+01:08:04.580 --> 01:08:04.720
+Yeah.
+
+01:08:05.440 --> 01:08:09.060
+So on paper, at least, you do have something that's not only
+
+01:08:09.080 --> 01:08:13.800
+carbon negative, but also more cost effective, because the waste product of
+
+01:08:13.890 --> 01:08:18.319
+the compute, the heat, is now a thing that someone is willing to pay for as well. So
+
+01:08:18.560 --> 01:08:21.200
+on paper, you can also be cheaper if you play your cards right.
+
+01:08:21.250 --> 01:08:26.720
+I ran across this and I was really inspired by it. I ended up not using LeafCloud for anything,
+
+01:08:27.020 --> 01:08:28.000
+but it's a cool idea.
+
+01:08:28.900 --> 01:08:33.100
+So the one downside of that approach is you do have to be close to a data center, because
+
+01:08:33.589 --> 01:08:37.540
+do you really want to store your private data on disk in an apartment building, or in a secure
+
+01:08:37.560 --> 01:08:41.980
+facility? So what they do is they just have glass fiber cables.
Like the disk is in the facility,
+
+01:08:42.279 --> 01:08:45.819
+but the memory and the GPU and the computer is in the apartment building. That's sort of the way
+
+01:08:45.839 --> 01:08:51.859
+it's like a mounted volume over fiber, sort of like a network volume over fiber. Sure. I think
+
+01:08:51.960 --> 01:08:56.140
+that's pretty common. A lot of data center setups, it just happens to be spanning an apartment
+
+01:08:56.299 --> 01:09:00.839
+complex, which is unusual. Oh, it's also only for certain types of compute loads, right? You're
+
+01:09:00.940 --> 01:09:04.820
+probably not going to run your web app from that apartment building. It's probably not going to be
+
+01:09:04.680 --> 01:09:09.400
+productive. But if you're doing big simulation, big deep learning, things like that, actually,
+
+01:09:09.560 --> 01:09:15.540
+everyone knows you can do machine learning training, and it's all good for
+
+01:09:15.580 --> 01:09:20.080
+serverless. You just send the stuff out, it just goes, finds an apartment complex, runs it serverless, and
+
+01:09:20.140 --> 01:09:25.140
+it comes back, right? Now, just kidding. So yeah, it's a pretty neat idea. Yeah, we'll see what
+
+01:09:25.140 --> 01:09:29.780
+the future brings. Like when I did data science, the joke I always made is, as a data scientist,
+
+01:09:29.799 --> 01:09:32.339
+I know for sure that the future is really hard to predict.
+
+01:09:35.580 --> 01:09:37.520
+Maybe the past is easier to predict,
+
+01:09:37.620 --> 01:09:38.600
+and that's what we do in machine learning.
+
+01:09:38.660 --> 01:09:40.299
+We try to predict the past, and yeah,
+
+01:09:40.420 --> 01:09:41.799
+we'll see what these LLMs do.
+
+01:09:41.980 --> 01:09:45.560
+But until then, again, also the repeating lesson from the course,
+
+01:09:46.120 --> 01:09:47.819
+there's some good habits, but in the end,
+
+01:09:47.900 --> 01:09:49.920
+it's not so much the LLM, it's more the stuff around
+
+01:09:50.060 --> 01:09:50.940
+and how you think about it.
+
+01:09:51.180 --> 01:09:53.240
+And also, small shout-out to Marimo.
+
+01:09:53.400 --> 01:09:55.940
+I do think having some UI in the blend as well,
+
+01:09:56.060 --> 01:09:57.820
+such that you can say, oh, there's a text box here,
+
+01:09:58.260 --> 01:09:59.020
+another text box there.
+
+01:09:59.220 --> 01:10:03.860
+You can kind of play around making your own little LLM app. The thing that I typically do is I
+
+01:10:04.010 --> 01:10:08.720
+hook up Gemini to my YouTube videos to make summaries for my blog. I like to try to find little
+
+01:10:08.860 --> 01:10:13.100
+excuses to improve your life by automating something with these LLM tools. And that's a nice
+
+01:10:13.100 --> 01:10:17.840
+way to get started. Yeah, really, I think we all probably have little opportunities to automate
+
+01:10:18.280 --> 01:10:22.980
+some low-stakes thing that we spend a lot of time or hassle on, and let an LLM
+
+01:10:23.400 --> 01:10:25.380
+have at it and write a little Python to make that happen, right?
+
+01:10:25.980 --> 01:10:28.940
+Yeah, and also expose yourself to your own slop as well.
+
+01:10:30.590 --> 01:10:35.420
+The nice thing about building your own apps is that you then suddenly start to realize when it's still good to be a human.
+
+01:10:35.840 --> 01:10:39.560
+Like, I'm not going to have an LLM automate messages to my wife and my kid.
+
+01:10:40.000 --> 01:10:42.980
+That's just not going to happen because I've seen what these LLMs can output.
+
+01:10:43.280 --> 01:10:45.960
+Like, that's going to remain on the human side of the spectrum.
+
+01:10:46.860 --> 01:10:47.220
+Exactly.
+
+01:10:47.750 --> 01:10:51.320
+But making a summary for my blog on a video, I mean, that feels in the ballpark.
+
+01:10:51.820 --> 01:10:52.180
+100%.
+
+01:10:52.440 --> 01:10:52.640
+All right.
+
+01:10:52.960 --> 01:10:55.740
+Final word for people interested in this topic who want to build.
+
+01:10:56.480 --> 01:10:58.940
+They want to plug in LLMs as part of those building blocks.
+
+01:10:59.360 --> 01:10:59.760
+What do you say?
+
+01:11:00.940 --> 01:11:02.280
+OpenRouter is nice and flexible.
+
+01:11:02.640 --> 01:11:03.960
+LLM by Simon Willison is great.
+
+01:11:04.360 --> 01:11:08.020
+Diskcache is a thing; you do want to cache to disk.
+
+01:11:09.840 --> 01:11:12.040
+Maybe a final thing that I do maybe want to...
+
+01:11:12.200 --> 01:11:14.760
+It's also fine if you just want to learn Python as well.
+
+01:11:15.000 --> 01:11:17.260
+There's some of this talk of people saying,
+
+01:11:17.400 --> 01:11:18.480
+oh, we never have to...
+
+01:11:19.100 --> 01:11:21.240
+Oh, actually, people from Peru are tuning in.
+
+01:11:21.600 --> 01:11:22.060
+Peru is amazing.
+
+01:11:22.440 --> 01:11:24.360
+I'm seeing people sort of replying in the conversation.
+
+01:11:24.380 --> 01:11:24.900
+Yeah, absolutely.
+
+01:11:25.360 --> 01:11:25.760
+Oh, sweet.
+
+01:11:26.840 --> 01:11:27.960
+Best food of my life, Peru.
+
+01:11:28.350 --> 01:11:30.960
+Also, if you're looking for great excuses to learn Python,
+
+01:11:31.100 --> 01:11:31.380
+there's plenty.
+
+01:11:31.430 --> 01:11:33.460
+If you're looking for great excuses to go to Peru, food.
+
+01:11:35.300 --> 01:11:37.660
+But also, I think in the end,
+
+01:11:38.300 --> 01:11:40.720
+you are going to be better at using LLMs
+
+01:11:41.070 --> 01:11:42.200
+if you can still do things yourself.
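[Editor's note: a minimal sketch of the disk-caching habit mentioned here. The function names, the `.llm_cache` directory, and the stand-in `fake_llm` client are illustrative, not from the episode; in practice the `diskcache` package gives you the same behavior with a `Cache.memoize()` decorator, and the real call would go through OpenRouter or Simon Willison's `llm` instead.]

```python
import hashlib
import json
from pathlib import Path

CACHE_DIR = Path(".llm_cache")  # illustrative location for cached responses

def cached_llm_call(prompt: str, model: str, call_fn) -> str:
    """Return a disk-cached response for this prompt/model pair,
    calling the (slow, metered) LLM API only on a cache miss."""
    CACHE_DIR.mkdir(exist_ok=True)
    key = hashlib.sha256(f"{model}\n{prompt}".encode()).hexdigest()
    path = CACHE_DIR / f"{key}.json"
    if path.exists():  # cache hit: no network call, no token cost
        return json.loads(path.read_text())["response"]
    response = call_fn(prompt, model)  # the real API call happens here
    path.write_text(json.dumps({"prompt": prompt, "model": model, "response": response}))
    return response

# Stand-in for a real client (OpenRouter, Simon Willison's llm, etc.):
def fake_llm(prompt: str, model: str) -> str:
    return f"[{model}] summary of: {prompt}"

first = cached_llm_call("Summarize episode 528", "demo-model", fake_llm)
second = cached_llm_call("Summarize episode 528", "demo-model", fake_llm)  # served from disk
```

Storing the prompt alongside the response also makes it easy to inspect later what was actually sent to the model, which is handy when debugging these apps.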
+ +01:11:42.520 --> 01:11:44.360 +One thing I've started noticing is that + +01:11:44.660 --> 01:11:45.620 +it's a bit embarrassing, but like, + +01:11:45.690 --> 01:11:47.260 +oh, what's the CSS property of that thing? + +01:11:47.260 --> 01:11:48.000 +I used to know that. + +01:11:48.030 --> 01:11:49.100 +But nowadays, I just, oh, + +01:11:49.370 --> 01:11:51.440 +am I seriously going to ask this, the ChatGPT? + +01:11:51.500 --> 01:11:53.400 +This should be in my memory for Pete's sakes. + +01:11:53.520 --> 01:11:53.640 +Yes. + +01:11:54.140 --> 01:11:55.580 +And sometimes it's like, I'm going + +01:11:55.600 --> 01:11:57.380 +to have a whole conversation with ChatGPT + +01:11:57.440 --> 01:11:59.520 +so I can change four characters in a style sheet. + +01:11:59.620 --> 01:12:00.280 +What is this nonsense? + +01:12:00.840 --> 01:12:03.040 +So part of me is also thinking like, definitely + +01:12:03.300 --> 01:12:04.500 +expose yourself to LLMs. + +01:12:04.500 --> 01:12:06.000 +And if that inspires you, that's great. + +01:12:06.440 --> 01:12:10.100 +But also try to not overly rely on it either. + +01:12:10.620 --> 01:12:12.680 +It's like I'm building flashcard apps for myself. + +01:12:12.840 --> 01:12:15.420 +So I'm still kind of in the loop, if that makes sense. + +01:12:15.920 --> 01:12:19.020 +So you've got to be a guardian of your own mind. + +01:12:19.600 --> 01:12:21.380 +And yes, so I can see it so easily. 
+
+01:12:21.500 --> 01:12:28.100
+that you just become a machine that just says, next, okay, yes, continue. And then in six months you're
+
+01:12:28.100 --> 01:12:33.220
+like, gosh, I can't program anything, I just have to ask the chat. And it's a problem. Yeah, it's
+
+01:12:33.320 --> 01:12:38.340
+something. Well, and also, that kind of gets back to the point I made earlier. You want
+
+01:12:38.340 --> 01:12:42.960
+to be very mindful that if you have better tools, you should also have better ideas. And if
+
+01:12:43.040 --> 01:12:48.520
+that's not the case, then something is wrong, because then you get into the learned
+
+01:12:48.540 --> 01:12:51.800
+helplessness territory. And that's the one thing I do want to kind of guard against.
+
+01:12:52.610 --> 01:12:56.260
+Andrej Karpathy gave a great interview a little while ago where he mentioned the
+
+01:12:56.660 --> 01:12:59.900
+worst outcome for humanity. Did you see the movie WALL-E? Remember seeing that?
+
+01:13:00.000 --> 01:13:03.060
+Oh my gosh. That movie really made me sad.
+
+01:13:04.060 --> 01:13:08.060
+It's a good movie though. It's a fantastic movie, but as an adult,
+
+01:13:08.150 --> 01:13:10.660
+it's got a hard undertone to it. Oh my gosh.
+
+01:13:11.670 --> 01:13:16.300
+And especially now, because for people who never watched the movie, there's a scene in the movie where you see what humanity
+
+01:13:16.320 --> 01:13:20.020
+comes to. And it's basically, everything has to be convenient. So people are overweight,
+
+01:13:20.540 --> 01:13:24.740
+they're in these bikes that move them around, they're always able to watch television, drink a
+
+01:13:24.800 --> 01:13:29.800
+milkshake, and eat a pizza. Every convenience is sort of taken care of, but you also just see
+
+01:13:29.900 --> 01:13:34.820
+that they're not happy, and they can't even walk. They're just so
+
+01:13:35.320 --> 01:13:39.320
+dependent on the thing. It's bad. Exactly. So that's a sweet show, it's a very sweet show. I
+
+01:13:39.440 --> 01:13:43.900
+recommend it. Yeah, it's a sweet movie. But if at some point you notice, like, hey,
+
+01:13:44.340 --> 01:13:49.080
+learning feels hard, then maybe that's fine because maybe it's just like exercising.
+
+01:13:49.300 --> 01:13:50.960
+It's okay if that's a bit painful.
+
+01:13:51.120 --> 01:13:54.800
+In fact, if it's not painful, then maybe you're just participating in entertainment instead.
+
+01:13:54.840 --> 01:13:55.800
+So that's the thing.
+
+01:13:56.240 --> 01:13:57.680
+These are thoughts that I'm sort of battling with.
+
+01:13:57.760 --> 01:14:01.320
+I'm also sort of trying to be really critical of like, okay, I work at Marimo now.
+
+01:14:01.500 --> 01:14:03.280
+I've got access to all these LLMs and stuff.
+
+01:14:03.420 --> 01:14:06.340
+And can I build more awesome software than I would have before?
+
+01:14:06.660 --> 01:14:10.640
+One thing I made a while ago is I added gamepad support to Marimo Notebooks.
+
+01:14:10.780 --> 01:14:12.520
+So you can interact with Python with a gamepad and stuff.
+
+01:14:12.680 --> 01:14:13.640
+So this is just an experiment.
+
+01:14:14.330 --> 01:14:22.120
+But I would be really disappointed if I'm the only person making packages I've never made before, ones that are definitely trying to reach new heights.
+
+01:14:23.780 --> 01:14:26.360
+It does feel a bit lonely because it does feel like more people should do it.
+
+01:14:26.580 --> 01:14:28.860
+That's like the one sort of hint I want to give to people.
+
+01:14:29.260 --> 01:14:34.280
+Try to inspire yourself a bit more and do more inspirational stuff so more cool things happen on my timeline.
+
+01:14:34.700 --> 01:14:35.600
+There's less AI slop.
+
+01:14:35.650 --> 01:14:38.000
+There's really cool software being built, people learning and all.
+
+01:14:38.460 --> 01:14:38.700
+Hear, hear.
+
+01:14:38.990 --> 01:14:39.560
+I appreciate that.
+
+01:14:39.720 --> 01:14:39.960
+All right.
+
+01:14:40.220 --> 01:14:42.660
+Well, Vincent, thanks for being back,
+
+01:14:43.180 --> 01:14:44.360
+sharing all these techniques.
+
+01:14:44.760 --> 01:14:45.800
+And yeah, I hope people go out there
+
+01:14:45.900 --> 01:14:48.580
+and build using AIs, not just with AIs.
+
+01:14:48.860 --> 01:14:49.060
+Definitely.
+
+01:14:49.440 --> 01:14:49.920
+Y'all have a good one.
+
+01:14:50.260 --> 01:14:50.760
+Yeah, see you later.
+
+01:14:50.920 --> 01:14:51.040
+Bye.
+
+01:14:51.720 --> 01:14:54.100
+This has been another episode of Talk Python To Me.
+
+01:14:54.600 --> 01:14:55.380
+Thank you to our sponsors.
+
+01:14:55.780 --> 01:14:57.160
+Be sure to check out what they're offering.
+
+01:14:57.440 --> 01:14:58.760
+It really helps support the show.
+
+01:14:59.180 --> 01:15:00.540
+Take some stress out of your life.
+
+01:15:00.920 --> 01:15:02.720
+Get notified immediately about errors
+
+01:15:03.060 --> 01:15:04.740
+and performance issues in your web
+
+01:15:04.900 --> 01:15:06.360
+or mobile applications with Sentry.
+
+01:15:06.860 --> 01:15:10.180
+Just visit talkpython.fm/sentry
+
+01:15:10.200 --> 01:15:11.240
+and get started for free.
+
+01:15:11.690 --> 01:15:14.900
+And be sure to use the promo code Talk Python, all one word.
+
+01:15:15.460 --> 01:15:17.480
+And it's brought to you by NordStellar.
+
+01:15:18.000 --> 01:15:20.520
+NordStellar is a threat exposure management platform
+
+01:15:21.100 --> 01:15:22.340
+from the Nord Security family,
+
+01:15:22.610 --> 01:15:23.760
+the folks behind NordVPN
+
+01:15:24.340 --> 01:15:25.940
+that combines dark web intelligence,
+
+01:15:26.540 --> 01:15:28.100
+session hijacking prevention,
+
+01:15:28.700 --> 01:15:30.800
+brand and domain abuse detection,
+
+01:15:31.080 --> 01:15:32.780
+and external attack surface management.
+
+01:15:33.340 --> 01:15:35.360
+Learn more and get started keeping your team safe
+
+01:15:35.620 --> 01:15:38.320
+at talkpython.fm/Nordstellar.
+
+01:15:38.940 --> 01:15:51.840
+If you or your team needs to learn Python, we have over 270 hours of beginner and advanced courses on topics ranging from complete beginners to async code, Flask, Django, HTML, and even LLMs.
+
+01:15:52.220 --> 01:15:56.700
+Best of all, there's not a subscription in sight. Browse the catalog at talkpython.fm.
+
+01:15:57.100 --> 01:16:02.980
+Be sure to subscribe to the show. Open your favorite podcast player app. Search for Python. We should be right at the top.
+
+01:16:03.380 --> 01:16:06.760
+If you enjoy the geeky rap theme song, you can download the full track.
+
+01:16:07.110 --> 01:16:08.740
+The link is in your podcast player show notes.
+
+01:16:09.480 --> 01:16:10.660
+This is your host, Michael Kennedy.
+
+01:16:11.030 --> 01:16:12.260
+Thank you so much for listening.
+
+01:16:12.410 --> 01:16:13.280
+I really appreciate it.
+
+01:16:13.700 --> 01:16:15.360
+Now get out there and write some Python code.
+
+01:16:25.760 --> 01:16:29.680
+Talk Python To Me, and we ready to roll
+
+01:16:31.260 --> 01:16:35.000
+Upgrading the code, no fear of getting old
+
+01:16:36.400 --> 01:16:39.900
+We tapped into that modern vibe, overcame each storm
+
+01:16:40.640 --> 01:16:43.480
+Talk Python To Me, async is the norm
+
diff --git a/transcripts/529-python-apps-with-llm-building-blocks.txt b/transcripts/529-python-apps-with-llm-building-blocks.txt
new file mode 100644
index 0000000..3b5e817
--- /dev/null
+++ b/transcripts/529-python-apps-with-llm-building-blocks.txt
@@ -0,0 +1,2284 @@
+00:00:00 A lot of people building software today never took the traditional computer science path.
+
+00:00:04 They arrived through curiosity, or a job that needed automating, or a late-night itch that
+
+00:00:09 made something work. This week, David Kopec joins me to talk about computer science for
+
+00:00:14 exactly these folks, the ones who learned to program first and are now ready to understand
+
+00:00:18 the deeper ideas that power the tools they use every day. This is Talk Python To Me,
+
+00:00:23 episode 529, recorded October 26, 2025.
+
+00:00:44 Welcome to Talk Python To Me, the number one Python podcast for developers and data scientists.
+
+00:00:49 This is your host, Michael Kennedy. I'm a PSF fellow who's been coding for over 25 years.
+
+00:00:55 Let's connect on social media.
+
+00:00:57 You'll find me and Talk Python on Mastodon, Bluesky, and X.
+
+00:01:00 The social links are all in your show notes.
+
+00:01:03 You can find over 10 years of past episodes at talkpython.fm.
+
+00:01:06 And if you want to be part of the show, you can join our recording live streams.
+
+00:01:10 That's right.
+
+00:01:11 We live stream the raw uncut version of each episode on YouTube.
+
+00:01:14 Just visit talkpython.fm/youtube to see the schedule of upcoming events.
+
+00:01:19 Be sure to subscribe there and press the bell so you'll get notified anytime we're recording.
+
+00:01:23 This episode is brought to you by Sentry.
+
+00:01:26 Don't let those errors go unnoticed.
+
+00:01:27 Use Sentry like we do here at Talk Python.
+
+00:01:29 Sign up at talkpython.fm/sentry.
+
+00:01:33 And it's brought to you by NordStellar.
+
+00:01:35 NordStellar is a threat exposure management platform from the Nord security family,
+
+00:01:40 the folks behind NordVPN, that combines dark web intelligence,
+
+00:01:44 session hijacking prevention, brand and domain abuse detection,
+
+00:01:48 and external attack surface management.
+
+00:01:50 Learn more and get started keeping your team safe at talkpython.fm/Nordstellar.
+
+00:01:56 David, welcome back to Talk Python To Me.
+
+00:01:58 Thank you so much for having me back, Michael.
+
+00:02:00 It's really an honor.
+
+00:02:01 Got some really fun stuff to talk about, Computer Science from Scratch.
+
+00:02:06 What does that even mean?
+
+00:02:07 We're going to find out because you wrote the book on it.
+
+00:02:10 Yeah, I'm excited.
+
+00:02:11 And this book just came out, so it's kind of fresh off the press.
+
+00:02:14 And I want to thank you for being the technical reviewer
+
+00:02:17 on the book, actually.
+
+00:02:18 Yeah, it was really fun.
+
+00:02:19 You reached out and asked me, and I don't normally do things like that,
+
+00:02:21 but I'm like, you know, that'd be kind of fun. It would be a good experience. And I do think it was. It taught
+
+00:02:26 me some about creating books and I guess that pays off as well. And congratulations, by the way,
+
+00:02:31 on Talk Python in Production coming out. Yeah, thanks. I started that last December and it came
+
+00:02:36 out, I think the beginning of this month, maybe end of September, something like that, but pretty
+
+00:02:41 recently. So yeah, I really appreciate it. It's going really well.
We're in kind of the same phase
+
+00:02:45 because Computer Science from Scratch officially came out end of September also. So we're just in
+
+00:02:50 this kind of, you know, one month out from these books coming out. And so it's both, I think it's
+
+00:02:55 an exciting time for both of us. And I've started reading Talk Python in Production, and I'm really
+
+00:02:59 loving it. So congrats on it. Awesome. Thank you. And I really enjoyed your book. And I had to read
+
+00:03:03 it with a little more detail than just appreciate it, right? I had to like take notes and help you
+
+00:03:08 with some feedback. So that was a really cool experience. And thanks. You know, it's been six,
+
+00:03:12 almost a little, it's over five years, let's put it that way. It's been over five years since you've been
+
+00:03:16 on the show, when we talked about Classic Computer Science Problems in Python, I believe it was.
+
+00:03:22 And yeah, that was really a fun and popular episode. But let's do a quick refresher,
+
+00:03:26 a quick introduction for you to everyone who's new to the show in the last five years.
+
+00:03:30 Okay. So I'm a computer... My name is David Kopec. I'm a computer science professor
+
+00:03:35 at Albright College in Reading, Pennsylvania. Actually just moved over here from Champlain
+
+00:03:40 College, where I was for the past nine years in a similar role. And I'm also the program director
+
+00:03:45 of Computer Science and Information Technology. We're launching three new majors, a little plug
+
+00:03:49 for Albright, a revamped computer science major, an artificial intelligence major, and cybersecurity.
+
+00:03:55 And so I'm managing the launch of those for fall 2026. My background is for the past decade,
+
+00:04:02 obviously, as a computer science educator.
I've written five books on computer science and
+
+00:04:06 programming, the most successful of which was Classic Computer Science Problems in Python,
+
+00:04:11 which of course I was on the show five years ago about. And this book, Computer Science from Scratch,
+
+00:04:17 is kind of for a similar audience. Both books are for Python programmers, folks who are intermediate
+
+00:04:22 or advanced Python programmers, but maybe who want to fill in the kind of computer science topics
+
+00:04:28 that they don't know, maybe because they're self-educated. Maybe they learned Python on
+
+00:04:31 their own. They didn't have a formal CS degree, or maybe they did have a formal CS degree,
+
+00:04:36 but now they're preparing for interviews or transitioning to a more CS in-depth role.
+
+00:04:41 And they want to refresh on some of these topics.
+
+00:04:43 So both books are for folks who know Python, but want to go deeper on CS.
+
+00:04:48 Right. They're not, here's how you do a loop in Python, or here's what itertools is.
+
+00:04:53 It's true computer science topics, right?
+
+00:04:56 Right. And I think somebody at that stage will be very frustrated by the books.
+
+00:05:00 They really are. And so we have to put that preface in.
+
+00:05:03 And they really are for intermediate or advanced Python programmers.
+
+00:05:07 So, you know, so I'm trying to reach the same audience, but they're totally different books
+
+00:05:12 in terms of the type of topics they cover.
+
+00:05:14 So Classic Computer Science Problems in Python was more of a data structures and algorithms
+
+00:05:18 type of book, did some AI type topics as well, but very much on the algorithm side.
+
+00:05:24 Computer Science from Scratch is more, let's build up the layers of the software stack from
+
+00:05:29 the bottom up that Python kind of runs on top of. So you understand what's actually happening under
+
+00:05:33 the hood. So how does an interpreter work?
What's the hardware-software interface like? We get that
+
+00:05:39 with emulators. And then there is still some algorithmic stuff as well, different topics than
+
+00:05:44 in the prior book, but stuff like a little computer art stuff, a little bit of machine learning. So
+
+00:05:49 we try, it's still a survey book. It's still pretty broad, but it's more about the layers of
+
+00:05:54 the software stack than Classic Computer Science Problems in Python was. Yeah. And there's some
+
+00:05:58 pretty low-level stuff, and it's interesting you're doing it in Python. You know, I think a lot of
+
+00:06:02 times that would be taught in at least a pointer-based language, you know, like C, or even
+
+00:06:09 taught in assembly, like you're talking to the machine, literally. Absolutely. And you know,
+
+00:06:13 if you took these classes, like if you took an architecture class, which I'm going to be teaching
+
+00:06:19 next semester at Albright, it's going to be in an assembly language, right? And if you took a compilers
+
+00:06:23 course, it's usually in something like C, or today it might be in Rust, a low-level language where
+
+00:06:29 you do have that direct access to memory. But I wanted to make the book accessible. And Python is
+
+00:06:34 the most accessible language in the world. It's the most popular language in the world. And it's
+
+00:06:38 the language that we're increasingly using in computer science education anyway. So it is
+
+00:06:42 actually a little unusual to cover some of these topics in Python, but it makes the topics accessible
+
+00:06:48 to a much wider audience.
+
+00:06:50 It definitely makes it accessible to a wider audience.
+
+00:06:53 Let's take a step back and talk about this role you have
+
+00:06:56 in this new college, university you're at.
+
+00:06:59 You said that you're revamping the CS program.
+
+00:07:02 You're adding AI and cybersecurity.
+
+00:07:04 That's a big shift.
+ +00:07:05 Do you want to just talk to people about how that's changing, + +00:07:08 why you decided to change? + +00:07:10 I look at a lot of CS programs, and I don't know how connected they are to the real world. + +00:07:15 So imagine some of these changes are to sort of realign it with what's happened recently. + +00:07:21 Absolutely. + +00:07:21 Yeah. + +00:07:22 So I'm coming from a college, the Champlain College in Vermont, where I was at for nine + +00:07:27 years, which was a professionally focused college. + +00:07:29 So it was a college where we were preparing students for their careers. + +00:07:33 I moved now to Albright, which is a liberal arts college. + +00:07:36 So it's a college that's more teaching people how to think, creating great citizens. + +00:07:40 And hopefully we prepare them for their careers very well, too. + +00:07:43 but there's that liberal arts foundation underneath everything. + +00:07:46 So one of the challenges in developing these three new majors + +00:07:50 was how do you fit them into a liberal arts curriculum? + +00:07:52 And how do you make them relevant to careers while still being true to the liberal arts? + +00:07:56 And so what I found the best way to do it is to still focus on how to help people think computationally. + +00:08:03 And so there's a firm foundation of computer science and mathematics throughout all three majors. + +00:08:08 So whether you're doing cybersecurity, artificial intelligence, or computer science, + +00:08:12 you're going to have a lot of classes in common with one another. And we also need to infuse some + +00:08:16 of the issues we see in the workforce today. Like all of them will have a computer ethics course + +00:08:21 that's incorporated. And all of them will have an internship course that's incorporated. So + +00:08:26 students are getting real world experience as they're going through these majors. + +00:08:29 I think that's a great idea. 
It's really important to have done software engineering + +00:08:35 hands-on with other companies, real products, and real product managers, + +00:08:39 in addition to just knowing the algorithms and foundations. + +00:08:42 Absolutely. I couldn't agree more. And so we're trying to blend the two. And that's been one of + +00:08:48 the exciting things. And you asked why I changed. I was excited about this opportunity to build these + +00:08:52 new majors from the ground up. We'll be one of the first teaching colleges, small teaching colleges, + +00:08:57 to offer an artificial intelligence major. It's just been in the last five years that we've seen + +00:09:02 colleges start to offer an undergraduate major in artificial intelligence. And it's mostly been + +00:09:06 big name brand colleges like CMU that are offering these first artificial intelligence bachelor's + +00:09:14 degrees. But we get to be at the forefront of how do you fit that into a small liberal arts college + +00:09:19 and still make it career relevant and still have that firm liberal arts foundation. + +00:09:23 Are you concerned about letting all these hackers in on your network if you're teaching a cybersecurity? + +00:09:28 Well, I won't be teaching the cybersecurity courses. So I'm firmly just in the computer + +00:09:32 science education course, and we're hiring additional faculty in cybersecurity and in + +00:09:36 artificial intelligence. I'm more on the management side of those two programs. But, you know, look, + +00:09:41 it's more relevant than ever. When you look at actually the number of students going to college + +00:09:46 for cybersecurity, it's been exploding over the last decade. You know, I just talked about how + +00:09:52 artificial intelligence has just come about as a bachelor's degree. Cybersecurity was kind of there + +00:09:56 15 years ago. So cybersecurity started out as maybe you took a computer security course at the + +00:10:01 end of your computer science degree. 
Then it became a concentration you could do within some
+
+00:10:05 computer science degrees. And then in the 2010s, it broke out and became its own bachelor's degree.
+
+00:10:10 And so we're seeing the same thing right now with AI. It started out, you know, when you did
+
+00:10:14 undergrad computer science, maybe in the 90s or, like me, in the aughts, you might have one course,
+
+00:10:19 like an intro to AI course that you have at the end of your bachelor's degree. Then it started to
+
+00:10:23 be something where you do a concentration. At Champlain, I developed a concentration in
+
+00:10:27 artificial intelligence. Now, just in the last five years, we're seeing it break out from being
+
+00:10:31 just a part of a CS degree to being its own whole, you know, adjacent degree. So I think it's an
+
+00:10:38 exciting time for that. Of course, there's a hype cycle right now about AI. So a lot of colleges
+
+00:10:42 are jumping in and I think some of them are doing it the right way. And, you know, having that firm
+
+00:10:47 foundation in computer science and mathematics, so it's durable. And some of them are doing it in
+
+00:10:52 kind of a shallow way. We're trying to do it the right way. Yeah, that sounds great. And obviously
+
+00:10:56 cybersecurity is one of those things that's highly valuable. Nobody wants to be in the news
+
+00:11:01 for that reason, right? So companies are certainly looking for people with those skills.
+
+00:11:06 Let's talk one more thing about the university and not the AI focus, but just AI. One of the
+
+00:11:12 things I think is of all the places that's getting the most scrambled, changed, under pressure,
+
+00:11:19 whatever you call it, from AI, I think it's education in general. And I'm not talking just
+
+00:11:24 college. I'm talking, I don't know, like third grade. As soon as the kids can start using AI,
+
+00:11:28 they're like, this is like the calculator with a rocket booster on it. You know what I mean?
Like + +00:11:32 this will solve the problems. And I think there's a really big challenge for you as universities to + +00:11:38 connect with students, keep academic integrity as well. But there's also a huge problem, I think, + +00:11:43 for the students to not let it undercut their education and end up going, well, all I know + +00:11:49 to do is to do ChatGPT. You're absolutely right. I mean, it's a huge challenge in computer science + +00:11:54 education. And I don't think the computer science education community has yet completely figured it + +00:12:00 out. Well, basically what it breaks down to is exactly what you said. Students are just using + +00:12:05 ChatGPT or GitHub Copilot to do their homework. And if you're in a first year or second year class, + +00:12:12 I've been mostly teaching upper level CS for the past like six plus years. And I'm teaching an + +00:12:17 intro class for the first time in a long time this semester. And it's been eye-opening for me + +00:12:22 to see how many students are trying to just do every assignment with ChatGPT. And we still have + +00:12:28 to give them basic assignments. When you're first learning how to do a for loop or what a function + +00:12:33 is or what an if statement is even, right? You got to write some of those things. + +00:12:36 You can't say implement a database and get back to me next week. You got to start somewhere. + +00:12:40 So we still have to teach these fundamentals, but we have this opponent to us in some sense + +00:12:45 of this ease of access to something that can just do all the work for you. And so I'm sure that, + +00:12:51 you know, like you mentioned, mathematics educators had similar challenges in the 1970s, + +00:12:55 1980s as calculators became prominent. 
So what I've done, and I've been adjusting ever since +

00:13:02 ChatGPT came out in fall 2022, I've been constantly adjusting and reevaluating, but I have had to go +

00:13:08 back to the future a little bit and people might find this a little anachronistic, but I've heard +

00:13:12 about it from other CS educators as well, going back to doing some paper exams. I know that sounds +

00:13:18 crazy and I know that sounds bizarre, but at some point you need to evaluate if students +

00:13:24 actually know how to write a for loop because, you know, while we think, some people think LLMs are +

00:13:29 going to replace software engineers in the next two years, you still need to understand what it's +

00:13:33 outputting. And I don't think it's, I don't know about you, Michael, but I don't think they're +

00:13:36 completely replacing software engineers in the next couple of years. And that's coming, I don't +

00:13:41 know how many people maybe don't necessarily. I'm a huge fan of agentic coding and what it can do +

00:13:47 for productivity. And it's incredibly powerful, but it's one of those things that it needs someone to +

00:13:53 guide it who knows how to do that. And then it becomes a superpower. If you don't, you end up +

00:13:58 with like, how do we end up on React? I thought this was a Rust project. You're like, what happened +

00:14:02 here? Yeah. And you need to understand when it makes mistakes, you need to know how to correct +

00:14:06 those mistakes. And of course, you need to be understanding everything that it's outputting, +

00:14:11 so you're auditing it. This portion of Talk Python To Me is brought to you by Sentry's Seer. +

00:14:18 I'm excited to share a new tool from Sentry, Seer. Seer is your AI-driven pair programmer that finds, +

00:14:25 diagnoses, and fixes code issues in your Python app faster than ever. If you're already using +

00:14:31 Sentry, you are already using Sentry, right?
Then using Seer is as simple as enabling a feature on +

00:14:37 your already existing project. Seer taps into all the rich context Sentry has about an error. +

00:14:43 Stack traces, logs, commit history, performance data, essentially everything. Then it employs its +

00:14:48 agentic AI code capabilities to figure out what is wrong. It's like having a senior developer +

00:14:54 pair programming with you on bug fixes. Seer then proposes a solution, generating a patch for your +

00:15:00 code and even opening a GitHub pull request. This leaves the developers in charge because it's up to +

00:15:05 them to actually approve the PR. But it can reduce the time from error detection to fix dramatically. +

00:15:12 Developers who've tried it found it can fix errors in one shot that would have taken them +

00:15:17 hours to debug. Seer boasts a 94.5% accuracy in identifying root causes. Seer also prioritizes +

00:15:25 actionable issues with an actionability score, so you know what to fix first. This transforms +

00:15:32 Sentry errors into actionable fixes, turning a pile of error reports into an ordered to-do list. +

00:15:38 If you could use an always-on-call AI agent to help track down errors and propose fixes before +

00:15:43 you even have time to read the notification, check out Sentry's Seer. Just visit talkpython.fm +

00:15:50 slash Seer, S-E-E-R. The link is in your podcast player's show notes.
+

00:15:55 Be sure to use our code TALKPYTHON, one word, all caps. Thank you to Sentry for supporting Talk +

00:16:01 Python To Me. So we are having a real challenge, especially in those intro classes, of how do you +

00:16:08 kind of force students to not use these tools, essentially, because you're not learning anything +

00:16:13 if the tool writes the for loop for you when you're first learning how to do a for loop. And so you have +

00:16:18 to find ways to encourage it, to win hearts and minds, I think. Of course, that's a big part of it +

00:16:24 is convincing students and being dynamic and enthusiastic about how good it feels to really +

00:16:31 understand how this actually works. But then there has to be enforcement too. And sometimes that feels +

00:16:36 a little anachronistic being forceful about it or going back to doing tests on paper. But we have to +

00:16:41 have ways of ensuring the knowledge is actually there. I feel like this is not just a college +

00:16:47 student issue, but I think it's especially relevant in that part of your career that the struggle is +

00:16:52 not in the way. The struggle is often part of what unlocks your thinking. It's part of what +

00:16:59 cements the knowledge and makes you feel a true sense of accomplishment. When you're like, +

00:17:03 I tried this and I couldn't get it to work. But three hours later, I finally figured it out. And +

00:17:09 I now understand iterators. Finally. You know what I mean? And it's just so easy to push the easy +

00:17:15 button and just say, chat, why? Yeah. You know? And that's what feels good about being a teacher +

00:17:21 is being there for those aha moments with students. +

00:17:24 I had some moments like that last week and they reminded me, +

00:17:27 this is why you're in this career. +

00:17:31 It's really something that can become addictive to students actually. +

00:17:35 When they start having those aha moments, they want more of them and it spurs them on.
+ +00:17:40 That's how you end up at like 3 a.m. + +00:17:42 really hungry, wondering why you haven't gone to sleep + +00:17:45 but still programming. + +00:17:45 You're like, this is amazing, I can't stop. + +00:17:48 Right, right, right. + +00:17:48 And it takes a certain mindset to be able to appreciate those moments. + +00:17:54 And, you know, this is like a sidebar, but one thing we're also seeing in computer science education is we see a lot of folks who go into it sometimes for the wrong reasons. + +00:18:03 Folks sometimes go into computer science just because they hear, this is a great way to get a good job. + +00:18:08 And if that's your only motivation going into it, you're probably not going to be successful in it, unfortunately. + +00:18:15 It's going to be tough. Yeah, it's going to be tough. + +00:18:17 But a lot of people are there for the right reasons. So I think that that's good. + +00:18:21 Absolutely. And I hope with a book like this, folks who came in from the other side, + +00:18:25 folks who came in because they had that interest, but they didn't have the chance to either go to + +00:18:31 university, maybe they couldn't afford it, maybe they studied something else and they're later in + +00:18:35 their career. This will hopefully give them a bunch of those aha moments and about topics that + +00:18:41 are deeper than just how to write a for loop. Yeah, absolutely. Well, good. I mean, I feel like + +00:18:46 there's probably some middle ground you might be able to accomplish if you're like, all right, + +00:18:51 we're going to give you a laptop, a testing laptop that has no internet. Like we've gone in there and + +00:18:56 crushed the network card. Yeah. Here, take the test. And here's your thumb drive to submit your + +00:19:01 work and potentially, you know, potentially. Yeah. Yeah. 
And when we do stuff like that, +

00:19:04 like I just did a test on, you know, an online test for my students and I'm just monitoring to +

00:19:10 make sure nobody's using the tools. You know, but there is something to be said still for paper exams. +

00:19:15 I'll tell you why. Sometimes even with today's tools for giving exams, sometimes you want students +

00:19:19 to draw something, especially if it's a data structures and algorithms class, you might want +

00:19:23 them to draw a tree and it's just actually easier for them to do that on paper. So when people hear +

00:19:27 paper, they might be like, oh my goodness, what are you doing that for? No, there's real reasons. +

00:19:32 Yeah, of course. But yeah, we have monitoring tools too. Yeah, very good. All right. Let's maybe +

00:19:37 take a couple of different examples or different chapters in the book and talk through them. +

00:19:41 The first one, the first main topic is the smallest possible programming language, right? +

00:19:48 Yes, yes. +

00:19:49 Tell us about this. +

00:19:50 Yeah, the premise of the chapter is what's the minimum that we need to have a programming language? +

00:19:56 And there's a famous programming language. +

00:19:57 I'm actually not going to use the name of the language on the show, I think, just because it has the F word in it. +

00:20:02 And I didn't make up the name of the language. +

00:20:04 It was developed 30 years ago, but we'll just call it BF, okay? +

00:20:07 Yeah. +

00:20:09 Brain F star. +

00:20:10 Yeah, sure. +

00:20:11 Or an F star. This language, it only has eight symbols in it. I mean, it literally only has eight +

00:20:17 symbols in it, yet it's what we call Turing complete. And I'm not going to, I won't go into +

00:20:22 the full details of what it means for something to be Turing complete, but let me put it this way.
+

00:20:25 A language that is Turing complete can theoretically solve any of the same algorithmic +

00:20:31 problems as any other language that's Turing complete. And every programming language that +

00:20:35 you use is Turing complete, whether it's Java, Python, C, whatever, of course, they're all +

00:20:39 Turing complete. This language with only eight symbols in it is also Turing complete. So while +

00:20:44 you could code something like Quicksort in Python, you could also code it in BF with just eight +

00:20:50 symbols. While you could code a JSON parser in Python, you could also code a JSON parser in BF. +

00:20:56 So by learning this really, really basic language and actually implementing it, +

00:21:00 so implementing an interpreter for it, something that can actually run programs written in it, +

00:21:04 you really get to understand just how little we need to solve computational problems. +

00:21:09 We don't need much. +

00:21:10 We need a very minimal amount of computing machinery and a very minimal amount of syntax to be able to solve most problems. +

00:21:20 Now, would you want to solve most problems in BF? +

00:21:22 Of course not. +

00:21:23 We just use it as an illustrative example to show this is how simple a programming language can really be +

00:21:29 and still have all the same capabilities as a more advanced programming language. +

00:21:33 And you might wonder then why do we have much more advanced programming languages? +

00:21:37 Because they give us a lot more developer productivity. +

00:21:39 They have more abstractions that let us think as human beings instead of thinking like machines. +

00:21:43 The expressiveness, absolutely. +

00:21:45 And so that's great that we have all of that. +

00:21:48 But if you really want to understand essentially what the language is doing at the lowest levels, you only need these few bits. +

00:21:55 So what's the key message here?
+

00:21:57 Obviously, you're not trying to get people to become BF experts, right? +

00:22:00 I mean, maybe learn COBOL over BF these days. +

00:22:05 It could work for the Social Security Administration. +

00:22:07 Yes, exactly. +

00:22:08 There's going to be some high-paying COBOL jobs out there. +

00:22:11 But this is more about writing an actual interpreter, very much like CPython itself, in a sense, conceptually. +

00:22:19 Conceptually. +

00:22:20 I mean, much, much, much simpler. +

00:22:22 And actually, in the next chapter, we get to writing a BASIC interpreter, which is just one step up from BF. +

00:22:28 And we can talk about that in a minute. +

00:22:29 But yeah, it's about understanding how it works at a low level. +

00:22:34 Like what's it actually doing? +

00:22:35 And, you know, it feels good to make something that can run programs. +

00:22:39 A lot of people, when they get into computer science, are actually excited about like making their own language. +

00:22:45 So by doing these first couple of chapters of the book, you're actually on the path to that. +

00:22:48 I think after reading these first two chapters, you could go implement your own simple language and really make that kind of dream that a lot of us have come true. +

00:22:57 So it's about understanding how it works at a low level. +

00:22:59 I have to say the book is not a practical book. +

00:23:02 It's not like Talk Python in Production where you're going to learn, +

00:23:06 you know, here's some useful tools and tips and strategies. +

00:23:09 Install this thing and set up that config and then run it this way. +

00:23:12 To do something right now. +

00:23:13 Right. +

00:23:13 Computer Science from Scratch is not going to help you build your next app +

00:23:16 like the one you're building. +

00:23:18 Not directly.
+

00:23:19 Not directly, but it is going to help you understand +

00:23:22 a lot more about what's happening under the covers, +

00:23:25 which is ultimately going to make you think more broadly as a software developer +

00:23:29 and understand different strategies that you might be able to use. +

00:23:32 I'll give you an example with this interpreter stuff. +

00:23:35 You might be writing a program that needs to have some kind of configuration files, +

00:23:39 and you want to maybe be able to parse those configuration files. +

00:23:43 Well, part of writing an interpreter is writing a parser, +

00:23:46 something that understands the syntax of the programming languages +

00:23:50 and starts to get towards the semantics. +

00:23:52 And you might want to be able to write a parser later on for some very specific configuration type format that you've come up with +

00:23:58 for some, or maybe even just a file format for more of like an office type program. +

00:24:03 And you're going to need some way of understanding the techniques of how to parse that. Learning how +

00:24:08 an interpreter works will help you write that program later on. So it's about learning computational +

00:24:12 techniques, learning problem solving techniques more than it is about like something that's going to +

00:24:17 necessarily be, this is exactly how you do this for the next app you're going to build. +

00:24:20 It's not something that you would just stumble across most of the time, I think, right? You'll stumble +

00:24:28 across, you know, oh, here's an ML library or something, but you don't typically stumble across +

00:24:33 and here's how you build a parser from scratch. Right. And so, you know, there has to be that +

00:24:38 curiosity there. So I will admit that, you know, if you have no interest and you're not the type +

00:24:43 of person who wants to understand how things work, you won't like this book.
And you know, +

00:24:46 that is some folks and that's okay. There's folks who go into programming because they're only +

00:24:50 interested in what they can build and they're not so interested in how things work. And if that's +

00:24:55 you, this is probably not the book for you. But if you're the type of person who has that curiosity +

00:24:59 and you really want to understand how everything's actually working under the covers, then this is a +

00:25:04 great book. Yeah, absolutely. And it's a simple enough thing that you can grasp the ideas pretty +

00:25:10 quickly with this BF language, right? It's not so complicated that you're, you know, a day later, +

00:25:15 still trying to make the thing parse. Absolutely. I mean, what's crazy is to interpret BF, +

00:25:22 you only need about 30 lines of code, 30 lines of Python. +

00:25:26 And then you actually have something that can run any program written in this language. +

00:25:31 And to be clear, you have like something.bf files that you can put the language into, +

00:25:36 and then you say Python, this module, that file, and it'll run it as if it were an interpreter for that thing, right? +

00:25:43 Exactly. Yeah. +

00:25:44 You're literally implementing the whole programming language in like 30 lines of Python. +

00:25:48 And I think what's great about this too, is it takes away the feeling that everything is magic. +

00:25:54 That's another thing I love when people also read Classic Computer Science Problems in Python +

00:25:58 is sometimes when you think about how these things work, like in that book, we cover the A-star algorithm, +

00:26:04 which is something that Google Maps uses. +

00:26:06 When you think about Google Maps, it feels like magic when you use it. +

00:26:10 But actually there's really understandable, logical algorithms that are underneath the surface. +

00:26:16 It's the same thing here. +

00:26:17 Python itself might feel like magic to a lot of folks.
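That roughly 30-line interpreter can be sketched like this. To be clear, this is a minimal sketch of the idea, not the book's actual code; the function name and the little demo program at the bottom are my own:

```python
import sys

def run_bf(code: str) -> None:
    """Interpret a BF program: eight symbols, one tape of byte-sized cells."""
    # Pre-match the brackets so loops can jump in O(1).
    jumps, stack = {}, []
    for i, ch in enumerate(code):
        if ch == "[":
            stack.append(i)
        elif ch == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i

    tape = [0] * 30_000          # the classic 30,000-cell tape
    ptr = pc = 0
    while pc < len(code):
        ch = code[pc]
        if ch == ">":
            ptr += 1             # move the data pointer right
        elif ch == "<":
            ptr -= 1             # move the data pointer left
        elif ch == "+":
            tape[ptr] = (tape[ptr] + 1) % 256
        elif ch == "-":
            tape[ptr] = (tape[ptr] - 1) % 256
        elif ch == ".":
            sys.stdout.write(chr(tape[ptr]))
        elif ch == ",":
            data = sys.stdin.read(1)
            tape[ptr] = ord(data) if data else 0
        elif ch == "[" and tape[ptr] == 0:
            pc = jumps[pc]       # skip past the loop body
        elif ch == "]" and tape[ptr] != 0:
            pc = jumps[pc]       # loop back to the matching [
        pc += 1

# 8 iterations of "add 8" builds 64, plus one more is 65, i.e. "A".
run_bf("++++++++[>++++++++<-]>+.")   # prints "A"
```

To get the `python module.py program.bf` workflow described above, you could read the program text from `sys.argv[1]` and hand it to `run_bf`.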
+

00:26:19 But by the time you get through these first couple chapters, especially through the BASIC interpreter chapter, you'll start to be on the road to think, oh, you know what? +

00:26:26 I bet I could dive into the CPython source code with enough additional training and really understand it. +

00:26:32 It gives you that confidence that this is not just magic. +

00:26:35 You just got to look at the byte codes and look at it go. +

00:26:40 Yeah. +

00:26:40 Not to say there's not a lot more there, but it just gets you on that journey and makes you see it's not magic. +

00:26:44 Right. +

00:26:45 Well, it's that zero to one sort of gap that's the hardest to cross. +

00:26:49 Yeah. +

00:26:49 Yeah. +

00:26:50 You know, like the second language, second programming language you learn or the third +

00:26:55 or the fourth, they only get easier to learn, not harder to learn. +

00:26:58 Whereas, you know, maybe when you're first starting out and you're trying to get something +

00:27:01 to compile and it won't even run, you're like, oh my God, how am I going to do this? +

00:27:04 I can't even learn this one. +

00:27:06 There's all these things I'm going to have to know. +

00:27:07 And it's really kind of upside down. +

00:27:09 Yeah, absolutely. +

00:27:10 And so, you know, when we think about understanding how Python itself works, I think the second +

00:27:15 chapter of the book about BASIC gets us, you know, a lot further along on that journey. +

00:27:19 Because here this isn't just made up. +

00:27:22 BF is a language that's been around in computer science education for like 30 years, +

00:27:26 but it's not a real language that people actually used. +

00:27:29 And the second chapter- +

00:27:30 Very few people ever said that they were BF programmers +

00:27:33 at a dinner party, right? +

00:27:35 Absolutely. +

00:27:35 But plenty of people said they were BASIC programmers.
+

00:27:38 And it was the first programming language of a lot of people who grew up in the 70s, 80s, +

00:27:42 or even the 90s. +

00:27:42 It was my first programming language. +

00:27:44 And there was a dialect of BASIC called Tiny Basic that came out in the late 1970s. +

00:27:49 It was actually one of the first free software projects. And it would run on machines that just +

00:27:55 had two or four kilobytes of RAM. So that's why it was called Tiny Basic. I mean, it truly was +

00:28:01 tiny. And so we re-implement a dialect of Tiny Basic in chapter two. So this is re-implementing +

00:28:06 a real programming language that people actually used for real work in the late 1970s and up to the +

00:28:12 early 1980s. And it can run real programs from that period. So you could go download a program +

00:28:17 from the late 70s. We're missing like one or two features from the real language, +

00:28:22 but not all programs use all those features, so you could actually run +

00:28:25 it in this interpreter. +

00:28:26 So we go from the first language where it's really esoteric, educational, you know, weird +

00:28:31 language to the second chapter. +

00:28:32 This was a real thing. +

00:28:33 Yeah, absolutely. +

00:28:34 It was, you know, I'm just going to give a little shout out to Visual Basic. +

00:28:38 Yeah. +

00:28:38 I don't know. +

00:28:39 Did you ever do Visual Basic? +

00:28:40 I did. +

00:28:41 And I did a version on the Mac called Real Basic. +

00:28:43 So this is really esoteric. +

00:28:45 But anyone who used Real Basic, late 90s, I'm in your camp. +

00:28:48 There you go. +

00:28:50 I don't know if we even today still have something as approachable and productive as Visual Basic was. +

00:28:57 For people who haven't used it, you're like, there's just no way. +

00:28:59 There's no way that it's BASIC. +

00:29:01 But you would just get a visual thing.
+

00:29:02 You would drag it together to build your, you know, here, I want to have, literally, you would drag over. +

00:29:07 Here's a web browser. +

00:29:08 Here's the address bar. +

00:29:09 Here's a go button. +

00:29:11 And then you would double click the go button and it would create an event handler, +

00:29:14 and then you would go webBrowser.goto(text.value) or whatever. +

00:29:19 I mean, that was literally, you could do it in five minutes. +

00:29:21 You could like create something that is a functional web browser without much. +

00:29:25 It was incredible. +

00:29:26 And so, yeah, I'm just thinking back to a few things I've built with it. +

00:29:29 It was amazing. +

00:29:30 You know, Michael, a lot of people agree with you. +

00:29:32 There's a lot of articles that I've seen on blogs and stuff +

00:29:35 where people reminisce about Visual Basic. +

00:29:37 And I agree. +

00:29:38 I mean, for desktop app development, it was incredibly productive. +

00:29:42 I mean, I think it still rivals some of the tools we have for building web apps today. +

00:29:47 When you think about how easy it was to lay out a user interface. +

00:29:51 And for designers, it was great too, because designers didn't need to know how to code. +

00:29:55 And they could lay out the interface in the same way that it would really appear in the program, +

00:29:59 which is different from how designers work today, where they'll often do mock-ups. +

00:30:03 And the developer will have to take the mock-up and turn it into code. +

00:30:06 And so you kind of lose something there with the designer being able to have the final product +

00:30:12 in front of them as they're changing around how things look. +

00:30:15 So there are elements of it that are still missing today, I think. +

00:30:18 - Yeah, I think it really hasn't been matched.
+

00:30:21 Windows Forms from .NET kind of approached that, but it was still, itself was also +

00:30:25 a little bit more complicated. +

00:30:27 There was something special about that. +

00:30:29 And now, don't get me wrong, it's not like I'm saying we should just go back to it +

00:30:33 because the software we build is way more advanced, does a lot of other things, +

00:30:36 but that lower area is just kind of missing. +

00:30:41 This portion of Talk Python To Me is brought to you by NordStellar. +

00:30:44 NordStellar is a threat exposure management platform from the Nord Security family, +

00:30:48 the folks behind NordVPN that combines dark web intelligence, +

00:30:53 session hijacking prevention, brand abuse detection, and external attack surface management. +

00:30:58 Keeping your team and your company secure is a daunting challenge. +

00:31:02 That's why you need NordStellar on your side. +

00:31:04 It's a comprehensive set of services, monitoring, and alerts +

00:31:08 to limit your exposure to breaches and attacks and act instantly if something does happen. +

00:31:14 Here's how it works. +

00:31:15 NordStellar detects compromised employee and consumer credentials. +

00:31:19 It detects stolen authentication cookies found in InfoStealer logs and dark web sources +

00:31:25 and flags compromised devices, reducing MFA bypass ATOs without extra code in your app. +

00:31:31 NordStellar scans the dark web for cyber threats targeting your company. +

00:31:36 It monitors forums, markets, ransomware blogs, and over 25,000 cybercrime Telegram channels +

00:31:42 with alerting and searchable context you can route to Slack or your IRR tool. +

00:31:47 NordStellar adds brand and domain protection.
+

00:31:50 It detects cybersquats and lookalikes via visual, content similarity, and certificate transparency logs, +

00:31:56 plus broader brand abuse takedowns across the web, social, and app stores to cut the phishing risk for your users. +

00:32:03 They don't just alert you about impersonation. +

00:32:05 They file and manage the removals. +

00:32:08 Finally, NordStellar is developer-friendly. +

00:32:10 It's available as a platform and an API. +

00:32:14 No agents to install. +

00:32:15 If security is important to you and your organization, +

00:32:17 check out NordStellar. +

00:32:19 Visit talkpython.fm/nordstellar. +

00:32:21 The link is in your podcast player's show notes and on the episode page. +

00:32:24 Please use our link, talkpython.fm/nordstellar, so that they know that you heard about their service from us. +

00:32:31 And you know what time of year it is. +

00:32:32 It's late fall. +

00:32:34 That means Black Friday is in play as well. +

00:32:36 So the folks at NordStellar gave us a coupon, BlackFriday20. +

00:32:41 That's Black Friday, all one word, all caps, 20, two, zero. +

00:32:44 That grants you 20% off. +

00:32:45 So if you're going to sign up for them soon, go ahead and use BlackFriday20 as a code, +

00:32:50 and you might as well save 20%. +

00:32:52 It's good until December 10th, 2025. +

00:32:55 Thank you to the whole Nord Security team for supporting Talk Python To Me. +

00:33:00 And I want to talk, sort of transition from that to something else really. +

00:33:03 We're looking at these two examples of the BF language interpreter and the BASIC interpreter. +

00:33:08 I hear that to really understand computer science, really to work on these things, I have to. +

00:33:14 I don't, it's not preferable. +

00:33:15 I have to do a language with pointers, malloc, free. +

00:33:20 I've got to. +

00:33:21 I've got to work at that level. +

00:33:22 I just won't understand anything.
And Python, we don't really have those concepts. +

00:33:26 And the irony, I think, is Python has more pointers than C++ because there's like no stack at all. +

00:33:31 Not at all. +

00:33:32 Really? +

00:33:32 I mean, in the interpreter there is, but not in the code you're writing. Even the number one is just a pointer. +

00:33:37 So Python's full of pointers, but not in the way that computer science thinks about it. +

00:33:41 What are your thoughts about that sort of tension? On one hand, you have this really +

00:33:46 understandable language talking about these ideas, but the computer is calling malloc on a page of +

00:33:52 memory and that's what's happening and they're not seeing it. Okay. So let me talk about it from a +

00:33:57 pedagogy standpoint. So at my last institution, Champlain College, we had a big debate over my +

00:34:02 nine years there. Should we do our first three classes in Python or in C++? And when I came in, +

00:34:08 the first three classes were in C++. And we actually decided over the years to keep it there +

00:34:13 for exactly what you mentioned. We wanted to give students both that high level experience with +

00:34:17 object-oriented programming, but we also wanted them to have experience with pointers, with memory +

00:34:21 management, and understand how things work at a low level. But in that same period of time, many +

00:34:26 schools have moved to Python because of the other thing we talked about, which is accessibility. +

00:34:31 I think Python simply is an easier language to learn than C++. I don't think that most people +

00:34:37 who know both languages would really debate that. And so if we're trying to make those first ramp +

00:34:42 up classes where you're first learning CS as easy as possible, I think Python is the way to go if +

00:34:48 we want to encourage more people into the discipline.
That doesn't mean there shouldn't +

00:34:51 be a C or C++ course later on where folks, maybe when they take operating systems or even as part of an architecture class, see how the assembly language matches to a C program, that kind of thing. +

00:35:03 You know, it doesn't mean there shouldn't be C or C++ in the curriculum or Rust or whatever. +

00:35:07 But if we're thinking about what's best for a student who's just coming into the field, I think we need to think about accessibility. +

00:35:15 But at the same time, my advice to all students is learn one language well before you learn any other language. +

00:35:21 So whether you're starting with Python or you're starting with C++, spend a year or two on it and become really decent at it before you go and learn another language. +

00:35:31 And that's the biggest mistake I see folks who are self-taught make: +

00:35:35 constantly switching around from language to language. +

00:35:39 I need to know this. +

00:35:41 Okay, I got it. +

00:35:41 Now it's time to learn this. +

00:35:43 And they're trying to fill all these gaps, right? +

00:35:45 Right, right. +

00:35:46 Because once you learn Python well, a lot of the stuff in C or C++ will make sense to you. +

00:35:51 The pointers might not, but a lot of the other stuff, like how does a function work? +

00:35:55 How does a loop work? +

00:35:56 What are variables? +

00:35:57 What's a global versus local? +

00:35:58 All that kind of stuff. +

00:36:00 That's going to make sense to you. +

00:36:01 And that's going to be transferable skills once you've really learned it in any one language +

00:36:05 across any other language. +

00:36:06 And then pointers, you can learn later on. +

00:36:08 You don't need to learn that at the beginning of your CS education. +

00:36:12 People make it sound like it's a totally mystical topic.
+

00:36:16 If you are able to do calculus, and basically every CS degree requires Calculus I, you can learn pointers. +

00:36:23 You'll be okay. +

00:36:24 You can learn pointers. +

00:36:26 Yeah, exactly. +

00:36:27 Like double integrals and all that kind of stuff is way worse. +

00:36:30 I feel like you could also cycle, right? +

00:36:32 So you could start with Python for a couple of classes, then go deeper, closer to the machine with C. +

00:36:38 But then you could come back and say, let's look at Python again with new eyes and try to understand interpreted dynamic languages better. +

00:36:45 Because now you can take the red pill and you can see the arenas +

00:36:51 and the blocks of the memory allocator and the GC and all that kind of stuff, right? +

00:36:56 You could go, actually, you didn't know any of this stuff. +

00:36:59 Nobody probably even thought to ask, is this in here? +

00:37:02 And yet look at what's underneath what you're taking, +

00:37:05 what you're building on top of, right? +

00:37:07 Yeah, and also remember how long an academic semester is. +

00:37:10 It's 15 weeks. +

00:37:11 So if you're taking a class in college, you're forced to be doing that same language +

00:37:15 for 15 weeks. That's really the challenge for self-taught folks is they could just spend two +

00:37:19 days on one, two days on another, right? But absolutely, you're right. Once you've had enough +

00:37:23 time in one, you can cycle to the others back and forth. Yeah, fun. I personally am a fan of +

00:37:27 it being in Python because I feel like one of the biggest challenges to keeping people in computer +

00:37:33 science and programming is that they don't get enough early wins and early like, yes, aha, right? +

00:37:40 It's like, okay, in week 12, we'll let you write a program, but we're going to talk pointers for a while. +

00:37:46 And then you'll finally get to make some, you know what I mean?
+ +00:37:48 Like it's just so delayed and that's fine for certain people, but there's a lot of people are like, oh, forget this. + +00:37:54 This is not what I thought. + +00:37:55 I'm out. + +00:37:55 And a huge part of that is the Python library ecosystem. + +00:37:59 How easy it is to drop Pygame into an intro class and get somebody building a really simple game. + +00:38:06 You know, it's much harder in C or C++ to start integrating libraries and start having students understand how to use those libraries. + +00:38:14 And it usually requires a lot more knowledge buildup to be able to use pointers and stuff with those libraries. + +00:38:20 So Python just makes everything more accessible. + +00:38:22 Awesome. + +00:38:23 Cool. + +00:38:23 All right. + +00:38:24 The next area that was pretty interesting and a little bit, maybe a little artsy. + +00:38:31 Yeah. + +00:38:31 Is the computational art. + +00:38:33 Tell us about that. + +00:38:33 Yeah. So there's two chapters in the book on computational art. The first one is really + +00:38:39 starting to understand what a pixel is on the screen. And the way we do that is we take modern + +00:38:44 photos and then we want to display them on like an ancient Mac, a Mac from the 1980s, + +00:38:51 like the Mac Plus. And so we're going to take that modern photo and use what's called a dithering + +00:38:56 algorithm to break it down into patterns of black and white pixels because those early Macs were + +00:39:02 only black and white, that will then still kind of look like the photo in black and white with + +00:39:06 these specialized patterns. So you're learning a bunch of things by doing this. One thing you're + +00:39:11 learning is you're learning really what is a pixel. And a pixel is really pretty simple at its base. + +00:39:15 I mean, it's just a color and a location. 
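The dithering idea described above can be sketched in a few lines of Python. This is a generic Floyd-Steinberg error-diffusion sketch, not the book's actual code, and the tiny grid at the end is made up purely for illustration:

```python
# A generic Floyd-Steinberg error-diffusion sketch -- not the book's
# actual code. Each pixel is snapped to pure black (0) or white (255),
# and the rounding error is pushed onto not-yet-visited neighbors so
# the black/white pattern still resembles the original grayscale image.

def dither(pixels):
    """Dither a 2D list of grayscale values (0-255) to 0/255 in place."""
    height, width = len(pixels), len(pixels[0])
    for y in range(height):
        for x in range(width):
            old = pixels[y][x]
            new = 255 if old >= 128 else 0
            pixels[y][x] = new
            err = old - new
            # Classic Floyd-Steinberg weights: 7/16, 3/16, 5/16, 1/16.
            if x + 1 < width:
                pixels[y][x + 1] += err * 7 // 16
            if y + 1 < height:
                if x > 0:
                    pixels[y + 1][x - 1] += err * 3 // 16
                pixels[y + 1][x] += err * 5 // 16
                if x + 1 < width:
                    pixels[y + 1][x + 1] += err * 1 // 16
    return pixels

# A made-up 2x2 "photo": every output pixel ends up pure black or white.
result = dither([[100, 150], [200, 50]])
```

With a real photo you would run this over the image's full pixel grid (loaded with Pillow, say) before packing the bits into the Mac Paint format discussed below.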
And so understanding how those pixels are organized
+
+00:39:20 and really they're just organized usually as an array or, you know, as a list in Python or a
+
+00:39:26 NumPy array or whatever. And then we're understanding some algorithmic stuff. So
+
+00:39:32 this is giving us some algorithmic practice.
+
+00:39:35 And then a cool thing we do at the end is we actually convert it into a file format
+
+00:39:38 called Mac Paint for displaying on an ancient Mac.
+
+00:39:42 And that Mac Paint format uses a very simple compression scheme
+
+00:39:47 called run length encoding.
+
+00:39:49 So we're getting some other kind of algorithmic practice in there as well,
+
+00:39:52 understanding a simple compression algorithm.
+
+00:39:55 And we're understanding something about file formats too,
+
+00:39:57 which is really kind of an interesting CS topic as well.
+
+00:40:00 Yeah, it sure is, yeah.
+
+00:40:01 And you have to know how to properly format the file so the Mac Paint program on the ancient Mac will open it correctly.
+
+00:40:06 So yeah, that was something actually when I first got into programming, I feel like I just stuck
+
+00:40:11 with text oriented files pretty much as long as I could. Because, you know, looking at a binary file,
+
+00:40:17 like, okay, it has a header and we read the first few bytes and then the value of that byte tells
+
+00:40:22 you how long the rest of the header is and then like what this means. And then you skip. And I
+
+00:40:26 don't know. That was just, it was a bridge too far for me in my early programming days. And I was
+
+00:40:31 like, wow, this is intense. And I was in awe of people like, yeah, we just read the header and we
+
+00:40:35 do this. I'm like, okay, if you say so. You know, it's a classic CS trade-off between time and space.
+
+00:40:43 Anything that we do in a binary file, we could do in a text file.
But binary files can be more
+
+00:40:48 efficient because they can be more compact and they can be faster to read from for certain kinds
+
+00:40:53 of data. So, you know, it's not that we have to use binary files, but understanding what a binary
+
+00:41:00 file format is like can be eye-opening to some readers. And if you think about like modern file
+
+00:41:05 formats, of course, text formats are much more explainable, right? That's why we have the rise
+
+00:41:09 of things like XML in the late nineties up to today, or even JSON as a data interchange format.
+
+00:41:15 There's alternatives to JSON, of course, that are much more efficient for your web app,
+
+00:41:20 for distributing the data back and forth. But JSON is human readable. Yeah, yeah, yeah. But JSON is
+
+00:41:25 human readable. So we can right away understand what it's supposed to represent and debug it really
+
+00:41:31 well. And so it's, you know, it's one of those classic trade-offs. We can have something more
+
+00:41:34 efficient or we can have something that might take up a little bit more time, but actually,
+
+00:41:40 you know, is better for us as human beings. Yeah. It's just, I think it's a good skill.
+
+00:41:45 And also something that was prevalent throughout the book is juggling bits, bits and bytes, not just juggling bytes, but bits and bit flipping and shifting.
+
+00:41:54 And there's a lot of that going on, especially in the emulator layers and stuff like that.
+
+00:41:59 Yeah. And even in the Mac Paint chapter, because the way Mac Paint stores pixels is as individual bits.
+
+00:42:05 So you have like a one or a zero representing a black or a white pixel on the screen.
+
+00:42:09 And so you have to find a way to take those bits and compact them into bytes and then run the run length encoding algorithm to compress it. Yeah, there's a lot of that. And, you know, when you do really low level work in computing, you need to understand bits and bytes.
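To make the bits-to-bytes step concrete, here is a toy sketch of packing 1-bit pixels into bytes and then run-length encoding the result. This is illustrative only: the real Mac Paint format uses Apple's PackBits scheme, which is more involved than the simple (count, value) pairs below, and the MSB-first bit order here is an assumption.

```python
def pack_bits(pixels):
    """Pack a list of 0/1 pixel values into bytes, most significant bit
    first (the MSB-first ordering is an assumption for illustration)."""
    out = bytearray()
    for i in range(0, len(pixels), 8):
        group = pixels[i:i + 8]
        byte = 0
        for bit in group:
            byte = (byte << 1) | bit
        byte <<= 8 - len(group)   # pad a final partial group with zeros
        out.append(byte)
    return bytes(out)

def rle_encode(data):
    """Collapse runs of repeated bytes into (count, value) pairs --
    a simpler stand-in for the PackBits scheme Mac Paint actually uses."""
    runs = []
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i]:
            j += 1
        runs.append((j - i, data[i]))
        i = j
    return runs

# A row of 32 black pixels packs into four 0xFF bytes, which run-length
# encoding then collapses into a single (4, 255) run.
row = pack_bits([1] * 32)
```

Screen rows of mostly-white or mostly-black pixels compress extremely well this way, which is exactly why run-length encoding suited 1-bit Mac screens.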
If you're going to work on device drivers or operating systems or file formats, you really need to understand this stuff at an intimate level.
+
+00:42:31 So we try to make fun projects in the book where you get something that's interesting, as a way of adding a little bit of sugar to help the medicine go down kind of thing.
+
+00:42:43 And we were talking about the computer graphics chapters.
+
+00:42:45 I also wanted to mention chapter four because I love the program in that called Impressionist.
+
+00:42:50 It makes images that look like an impressionist painter painted a photograph.
+
+00:42:56 So you give it a photograph and then it builds kind of abstract art out of that photograph.
+
+00:43:01 And people usually think you need a neural network for that or you need some kind of really advanced machine learning algorithm for that.
+
+00:43:07 But actually, we show in the chapter that you can do it using a pretty simple algorithm.
+
+00:43:11 All the algorithm does is it puts a vector shape on the screen and it tries to position the vector shape so that it overlaps a region of color on the original photo that is close to the color in the vector shape.
+
+00:43:23 And if you keep doing that and you put enough vector shapes on the screen, you start to have like abstract shapes that look like the original photo.
+
+00:43:30 And so I think that chapter is kind of powerful because it shows you how a simple technique, a simple computational technique can really have some pretty powerful output.
+
+00:43:38 Yeah, it's a really interesting idea and it comes out looking great. It's sort of,
+
+00:43:42 it's approaching the problem from a different perspective, which I suspect is probably a pretty
+
+00:43:46 interesting CS lesson. You know, there's these problems that are incredibly expensive and
+
+00:43:52 difficult to compute the one true answer, but then there's amazingly fast ways to get a,
I'm thinking like Monte Carlo simulations and stuff like that.
+
+00:44:02 You're like, this could take two weeks or three milliseconds, which would you prefer?
+
+00:44:08 And, you know, we talked about, like, why is the book in Python?
+
+00:44:12 And, you know, that is one of the challenges of writing the book in Python is simply Python doesn't have great performance when you write in pure Python.
+
+00:44:21 And so, you know, we've all seen the benchmarks where Python's like 50 or 70 times slower than C on certain benchmarks.
+
+00:44:28 Right. And yeah, honestly, for some of the programs in this book, you really see that performance deficit you have with Python, like the abstract art chapter,
+
+00:44:38 where if I wrote that same program in C using some C graphics library instead of in Python
+
+00:44:43 with Pillow, even though Pillow, I think, is mostly implemented in C anyway, but still just
+
+00:44:47 with the overhead of having our algorithmic part of it in Python, that program is probably 30,
+
+00:44:54 50 times slower than it would be in C. So you have to wait like 20 minutes to see that abstract art
+
+00:44:59 that would have been like less than a minute in a C program. Not that it's relevant to the book or
+
+00:45:03 any of the courses, but you probably could bring in some optimizations like Cython or Numba,
+
+00:45:12 a couple of the things that'll take just the inner two loops and just make them go differently,
+
+00:45:18 but it's not the point of the book. That's true. And so I put that as an exercise for the reader
+
+00:45:23 in the NES emulator chapter, because in the NES emulator chapter we actually build up
+
+00:45:28 a real NES emulator. So I didn't write it in the book for legal reasons, but it can actually play
+
+00:45:32 Donkey Kong. So it can play Donkey Kong from the original Nintendo Entertainment System.
+
+00:45:36 Let's talk about the NES, the Nintendo Entertainment System.
+
+00:45:40 Yeah.
So, I mean, the NES, a lot of us remember growing up with, and if you're of a younger + +00:45:45 generation, this was like the main video game system of the 1980s to early 1990s that everyone + +00:45:51 had. It's where the original Super Mario came out, the original Legend of Zelda. And, you know, + +00:45:57 it's actually, of course, like all video game systems, it's a computer. And being a video game + +00:46:01 system for the 1980s. It's a pretty simple computer, actually. It has a 6502 microprocessor, + +00:46:07 which is the same microprocessor that was in the Apple II or the Commodore 64 or the TRS, + +00:46:12 I think the TRS-80 also. And that microprocessor, it only has 56 instructions, the NES version of it. + +00:46:21 And so you can write an interpreter for that microprocessor pretty compactly, + +00:46:26 not quite as compact as the BF interpreter in chapter one, but compactly enough that it's only + +00:46:31 lines of Python to be able to really write a simulator of that CPU. And so the NES is just + +00:46:37 that CPU plus a graphics processor called the picture processing unit plus an audio processor. + +00:46:43 So in the chapter, we implement the full CPU. We implement a simplified version of the PPU, + +00:46:49 not advanced enough to run Super Mario Brothers, but it is advanced enough to run Donkey Kong. + +00:46:54 And we don't implement the APU. So the audio processing unit is a little more complicated + +00:46:59 We don't do the audio. + +00:47:00 But we do all of this in the chapter so you can run real NES games. + +00:47:04 And what you're getting is you're getting to understand that software hardware interface. + +00:47:08 You're starting to understand how does our software actually get run on hardware + +00:47:14 by re-implementing what a CPU actually does. + +00:47:18 And you're re-implementing the memory map too. + +00:47:19 So how does a CPU connect to memory? + +00:47:21 How does a CPU connect to the graphics processor? 
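For a feel of what such an interpreter's core loop looks like, here is a toy fetch-decode-execute sketch. This is mine, not the book's code, and it implements just three of the 6502's 56 instructions, with no status flags, cycle counting, or memory map:

```python
class CPU:
    """A toy 6502-style fetch-decode-execute loop: three real opcodes
    (LDA immediate, TAX, INX) plus BRK used here simply as a halt."""

    def __init__(self, program):
        self.mem = list(program)   # program loaded at address 0
        self.pc = 0                # program counter
        self.a = 0                 # accumulator
        self.x = 0                 # X index register

    def step(self):
        op = self.mem[self.pc]     # fetch
        self.pc += 1
        if op == 0xA9:             # LDA #imm: load immediate into A
            self.a = self.mem[self.pc]
            self.pc += 1
        elif op == 0xAA:           # TAX: copy A into X
            self.x = self.a
        elif op == 0xE8:           # INX: increment X, wrapping at 8 bits
            self.x = (self.x + 1) & 0xFF
        elif op == 0x00:           # BRK: stop (the real BRK raises an interrupt)
            return False
        return True

cpu = CPU([0xA9, 0x41, 0xAA, 0xE8, 0x00])   # LDA #$41; TAX; INX; BRK
while cpu.step():
    pass
# cpu.x is now 0x42
```

The real emulator dispatches all 56 instructions this way, tracks status flags and cycles, and routes reads and writes through the NES memory map to reach the PPU and cartridge.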
You're doing all of that in that chapter.
+
+00:47:26 And so it's kind of like-
+
+00:47:27 Sure. You talked about like writing an interpreter for the BF and BASIC languages. Here you're
+
+00:47:34 completely emulating the hardware of the NES and you're potentially giving it
+
+00:47:42 open source NES games and it can run them, right?
+
+00:47:45 Correct. Yeah. It runs several NES open source games. Like I said, I didn't put it in the book for legal
+
+00:47:50 reasons, but it can run real commercial games as well for the NES. And so you're doing the whole
+
+00:47:54 soup to nuts like entire system except the audio. But it is Python and it's pure Python, except for,
+
+00:48:00 of course, the library we're using. We're using Pygame for displaying the window, which is written
+
+00:48:05 largely in C. But because it's pure Python, the emulator doesn't run at full speed. So it runs on
+
+00:48:12 my Mac. It runs at about 15 frames per second. The real NES ran at 60 frames per second. So we leave it
+
+00:48:18 as an exercise to the reader. Yeah, go use Cython or something like that. And I'm sure you can come
+
+00:48:24 up with several different techniques, not just with, definitely with Cython,
+
+00:48:27 but with several different techniques, you could get this up to 60 frames per second,
+
+00:48:29 but you're going to have to incorporate something that is, you know, using,
+
+00:48:35 that gets outside of just pure Python.
+
+00:48:37 Sure.
+
+00:48:38 You know, one, there's a bunch of ways that you already mentioned,
+
+00:48:40 but one new way that's kind of officially new, just this month is free-threaded Python.
+
+00:48:46 And I've been doing, I went back to my async programming course
+
+00:48:50 where there's a lot of examples of like, let's do this synchronously.
+
+00:48:53 Now let's do it with threads.
+
+00:48:54 Let's do it with multiprocessing.
+
+00:48:55 Now let's do it with asyncio, right?
And see like how those all compare.
+
+00:48:58 And the threads one, anything that was computational.
+
+00:49:01 It's like, yeah, it's the same speed or maybe even slower, right?
+
+00:49:03 But then I ran it with, I did uv run --python 3.14t and it went 10 times faster
+
+00:49:12 or eight times faster on my 10-core machine.
+
+00:49:14 And it's literally just running on the, I didn't change the code for free-threaded Python
+
+00:49:19 and it just took off.
+
+00:49:20 So I'm thinking here, graphics, amongst many things, is primed for this like embarrassingly parallel processing.
+
+00:49:29 Yeah.
+
+00:49:30 You could break up the screen into like little chunks and go, well, we got 10 chunks and 10 cores.
+
+00:49:35 Let's go at it.
+
+00:49:36 What do you think about it as a plausibility?
+
+00:49:39 Unfortunately, for a lot of emulators, especially emulators that are for more modern systems, threading would be a huge advantage.
+
+00:49:46 For the NES, it's not, because of the way that everything was timed.
+
+00:49:51 So every time that the CPU does one cycle, the PPU, the picture processing unit, has to be timed to do exactly three cycles.
+
+00:50:00 So you can't just go ahead and, you know, and you don't just give it work.
+
+00:50:04 It's not like a modern GPU where you just give it a bunch of chunks of work.
+
+00:50:07 It works on its own and then provides that to the screen.
+
+00:50:10 It's completely synchronized to the CPU.
+
+00:50:13 And so is the audio processing unit, too.
+
+00:50:15 So it all has to kind of be synchronized within a single thread.
+
+00:50:18 And all the games assume that that's how it works, right?
+
+00:50:20 That's how they have it all set.
+
+00:50:22 Yeah.
+
+00:50:22 Yeah.
+
+00:50:22 Yeah.
+
+00:50:23 And they had what was called a V-blank period, which was a period where it had already
+
+00:50:28 drawn the screen and then it actually tells the CPU, hey, I'm done drawing the screen.
+ +00:50:32 Now you can do all kinds of work and be ready for the next time that you draw the next frame + +00:50:35 again. + +00:50:36 So they had really tight synchronization between the graphics and the CPU. + +00:50:40 Interesting. + +00:50:41 Still cool. + +00:50:42 Still pretty cool. + +00:50:43 There's a lot going on there. + +00:50:44 If you want to understand hardware, that's a pretty low level look at it, right? + +00:50:48 I think it's kind of the crown jewel of the book. + +00:50:50 You know, there's a lot of people who are interested in emulators. + +00:50:54 And what I found before writing the book is there's not a single book out there that talks + +00:51:00 about writing an NES emulator, which is the most common emulator that people want to write. + +00:51:04 So I think this is the first book that has a chapter on writing an NES emulator. + +00:51:08 I will admit I was a little scared of like if Nintendo's legal team is going to like + +00:51:12 be upset about the book or something like that. + +00:51:14 but we did research on it. + +00:51:15 I mean, everything we're doing is perfectly legal in the book in the United States, at least. + +00:51:20 So, you know, but anyway. + +00:51:22 Scary, right? + +00:51:23 Yeah, absolutely. + +00:51:24 And we put a big disclaimer at the top of the chapter as you remember reading. + +00:51:28 I have a legal, I have a Nintendo story for you. + +00:51:32 Great. + +00:51:32 From when I was young. + +00:51:34 And this was, I think this is very late 80s, early 90s, last century. + +00:51:40 Now, this was not me, someone I knew, someone I was friends with had a, + +00:51:45 I think it was a super NES. + +00:51:47 Okay. + +00:51:47 And people may not remember, but they had these big honking cartridges go chonk. + +00:51:51 And you would chunk them down in there. + +00:51:53 You would turn on, you would basically reboot the NES and it would boot up from the, + +00:51:58 the big cartridge game cartridge you put in there. 
And there was this guy that they've, they found somehow, I don't,
+
+00:52:05 cause there's no internet.
+
+00:52:06 Maybe it's through a BBS.
+
+00:52:08 I don't know how this was found, but they found this guy who built a thing that the,
+
+00:52:12 bottom half of it looked like a game cartridge. The top was this whole computer that at the top of it,
+
+00:52:19 it had a 3.5 inch floppy drive. Wow. So you would go to your BBS, you would find a game,
+
+00:52:25 you would put it on a floppy, and then you would slot this huge extra computer down into the NES
+
+00:52:31 and you would turn it on and it would like delegate the file IO through the little fake cartridge
+
+00:52:37 back to it. And it worked like crazy. It was perfect. That guy was not in business for very
+
+00:52:41 long. The guy who sold them, not my, not my friend, but the person he got it from, wasn't available
+
+00:52:46 after a while. So this is, so Super NES comes out in like '91, um, so this might have been a regular
+
+00:52:53 No, I wasn't, I think I might have been in college, so yeah, it would be, it would be '91 onward. Yeah.
+
+00:52:58 Okay. I mean, that is '93, that is very advanced, like, for that time. When you think about the type of
+
+00:53:03 stuff people like were modding at that time, that guy was a genius. Yeah, he was. He should have been
+
+00:53:08 hired if he wasn't, you know what I mean? Yeah. Yeah, absolutely. Yeah, he was building them himself out of his
+
+00:53:14 dorm room or something, I don't know. It was, I think it was in, uh, Manhattan, at, uh, Kansas State
+
+00:53:18 University. I'm not sure where the guy was, but wow. Anyway, yeah, that guy was playing with fire.
+
+00:53:24 Yeah, absolutely. And you know what, um, sometimes, like, understanding how the ROM cartridge works,
+
+00:53:31 right? Like, we think again that it's like all magic, right? But actually, what those big cartridges that
+
+00:53:36 we had, that we like inserted in the NES were, is they were mostly plastic, like, that did nothing.
And there was a little ROM chip, like inside the big hunk of plastic and the ROM chip,
+
+00:53:46 ROM just stands for read only memory. So most of them were just like a big chunk of memory
+
+00:53:51 with a very tiny bit of logic chips that were called mappers on the NES that said like,
+
+00:53:55 read this memory at this time. But again, it wasn't really magic. And those cartridges look
+
+00:53:59 so intimidating, but they were really just a big piece of memory. Yeah. Think about how much you've
+
+00:54:04 got to make that work right before you ship it. Yeah. There's no patches. There's no download and
+
+00:54:10 update. Once it's shipped, it's shipped and it's burned into the ROM and that's it.
+
+00:54:14 I have some friends who work in game development and like before they even ship the 1.0,
+
+00:54:20 nowadays they're working on the first patch, like, and they just know they're not going to ship like
+
+00:54:24 the perfect program in 1.0. Now, to be fair, games were much simpler in the 1980s, right?
+
+00:54:30 than they are today, but they had to be perfect, like literally had to be perfect. And so
+
+00:54:35 there was a certain, I think, different attitude around game development then, than there is
+
+00:54:41 today. And remember, they were also working in assembly language. So it was no easy language
+
+00:54:46 to work in. I don't know if people like this word, but I would call it kind of hardcore, writing
+
+00:54:51 NES games in the 1980s. It was, you know, that kind of programming, it's, it's just, it was just
+
+00:54:57 different. It was just different. You had to be so detail oriented and you had to be so thorough.
+
+00:55:03 It's almost like writing spaceship control software type of thing. Not quite, but almost.
+
+00:55:08 And the machines were so slow. The NES, which we emulate in chapter six, ran on a 1.8 megahertz
+
+00:55:16 CPU, 6502, 1.8 megahertz. So Super Mario was like an incredible accomplishment on a 1.8 megahertz.
+ +00:55:25 So people had to worry about all these computer science topics in a way that they don't today as programmers, because you had to squeeze every last bit of algorithmic performance out of the machine. + +00:55:36 And so today, I think we have sloppier software today because we have an embarrassment of riches. + +00:55:42 We got such powerful computers with so much memory that people don't worry about writing things as efficiently as possible. + +00:55:48 And so we end up with inefficient software sometimes because people don't bother to do the algorithms right. + +00:55:55 Yes, but I'm going to put a little double dagger, like see the footnote by your comment. + +00:56:00 Absolutely true for software, not necessarily true for software developers. + +00:56:05 Agreed. + +00:56:05 The efficiency of writing a 99.9% correct Python program, how quickly that gets done versus + +00:56:13 like something in assembly language straight on a ROM. + +00:56:16 Yep. + +00:56:17 You know what I mean? + +00:56:17 These are the, I think this is an interesting arc and maybe you could speak to it a little + +00:56:22 bit from your academic perspective, but I think it's a really interesting arc of, it used to be + +00:56:27 really hard to program and we'd solve small problems with lots of effort and it's getting + +00:56:32 easier. And a lot of times it gets easier. People say, well, there it goes, job's over. There's not + +00:56:37 going to be programmers anymore because everyone can do it. And we just solve bigger, more complex + +00:56:41 problems. 
And that just keeps building, you know, like the, a web app, a basic e-commerce app from + +00:56:47 today was a keynote presentation of how like a fortune 500 company pulled it off in 1990 you + +00:56:55 know what i mean like yeah yeah here's how we got the ssl termination special machines to handle the + +00:57:00 ssl decryption so that we could do this under two seconds of request i mean like crazy stuff yeah + +00:57:06 what do you think about this well it's not an either or i mean we want developers to be productive + +00:57:10 and we also want efficient software so i mean the good news is that means that after we have make it + +00:57:15 it easier and easier to build software, we still need folks who are going to work on optimizing that + +00:57:19 software after it's built. So there's still jobs for software developers who are interested in that. + +00:57:24 I think you actually touched on this in talk Python in production to some degree, because you + +00:57:28 talk about in the book, the appropriate level of complexity for the type of app that you're + +00:57:33 building. And so you talk about like, you know, maybe you don't need a Kubernetes cluster if + +00:57:38 you're just building, you know, something for 10 million requests a month and not something for a + +00:57:43 million requests a month. And it's going to be run by one person or one person part-time versus a + +00:57:48 team or somebody who's a DevOps expert, right? Right, right, right. So if I'm working on a run + +00:57:55 of the mill e-commerce thing, maybe I don't worry about squeezing out every last bit of performance + +00:58:00 because I want to productively make something that, like you said, is 99.9% of the way there + +00:58:04 as quickly as possible. But if I'm building a 3D game, they still build most 3D games in C++ + +00:58:10 because they need to squeeze out every last bit of performance, + +00:58:14 even if it's a little less efficient for the programmers. 
The programmers could write the games faster, maybe in another language or another framework
+
+00:58:21 than they're using, whatever.
+
+00:58:22 But they use something that squeezes out every last bit of performance.
+
+00:58:25 So I think you have to think about what is the app that we're building?
+
+00:58:28 What is the domain that we're in?
+
+00:58:30 And it's also just not an either or.
+
+00:58:32 Like, you know, we want to have software that's both developer productive to build,
+
+00:58:37 but also that's efficient to run.
+
+00:58:39 And it's an ongoing process.
+
+00:58:40 We might need to make that trade-off, when we're first building it, in the direction of developer
+
+00:58:44 productivity. And then as we're maintaining it, maybe we go more towards the efficiency side.
+
+00:58:48 But if you don't think about some of the efficiency things up front, you can end up in a bad spot
+
+00:58:53 because you can get some technical debt. You could end up in a situation where it's hard to undo
+
+00:58:59 some of the algorithmic mistakes that you made early on that the system starts to depend on.
+
+00:59:03 So you should think about it at least a little bit.
+
+00:59:05 Yeah, absolutely. And I think you're talking about they've got to squeeze out that little
+
+00:59:09 extra bit of performance because they want high frame rates. They might have to squeeze out that
+
+00:59:12 little bit of extra performance because it's either possible or not possible. You know, you look at some
+
+00:59:17 of the stuff in the Unreal Engine these days, you're like, that's real time? That's the game?
+
+00:59:23 That looks like a rendered CGI movie, not a game. Yeah. Yeah. And, I mean, so a lot of it
+
+00:59:30 is being done for us today by library authors, right? So library authors are thinking about a
+
+00:59:34 lot of the low-level stuff. So us as like run-of-the-mill developers don't need to, which is
+
+00:59:39 great.
But if you're doing something really de novo, something, you know, with an innovative
+
+00:59:44 algorithm, you still need to consider efficiency at least. Right. And the other thing is,
+
+00:59:50 you can write really slow code in C and you can write really fast code in Python or vice versa.
+
+00:59:55 It's easier to make it slow in Python, to be fair. But algorithms and data structures are
+
+01:00:01 tremendous influences, right? If you should have been using a set or a dictionary and
+
+01:00:05 you're using a list, yeah, you're probably having a real bad time, performance-wise. And you caught me
+
+01:00:10 multiple times on the technical review in places where I should have used a set and I used a list.
+
+01:00:15 So thank you for that. But, but that's absolutely, that's a great example. I mean,
+
+01:00:19 just knowing that, that basic fact that, you know, in this situation, just swapping out,
+
+01:00:25 which is a simple switch, by the way, you know, one data structure for another can totally change
+
+01:00:29 performance characteristics. That's the type of things you do learn in the CS degree that you
+
+01:00:34 don't necessarily learn when you're a self-taught programmer kind of hacking everything out on your
+
+01:00:38 own, which is why a book like Classic Computer Science Problems in Python, or maybe Computer
+
+01:00:42 Science from Scratch is good for that. Yeah. It exposes you to this. Here's a data structure you
+
+01:00:47 haven't thought about, but here's why we're using it and the advantages and stuff. Yeah, absolutely.
+
+01:00:51 Well, we've covered a lot. We've pretty much solved computer science in 2025. That's good.
+
+01:00:57 And then the book is bringing computer science to so many people who are self-taught, you know, like me.
+
+01:01:04 I went to college, but I studied many, many years of math and found my way into programming.
+
+01:01:09 So I took a few CS programs, but not so many.
+ +01:01:12 So, you know, it certainly showed me many things that I hadn't seen. + +01:01:15 Like I've never emulated hardware before, for example. + +01:01:18 That was wild. + +01:01:19 And, you know, Michael, I was reading in Talk Python in production that you actually worked on Windows desktop apps. + +01:01:26 Is that right? + +01:01:26 I did. + +01:01:27 Yeah, for 10 years, more than 10 years. + +01:01:29 I'm almost 15 years, yeah. + +01:01:31 Wow, so I'm curious, how did you feel about that experience + +01:01:36 versus the kind of Python GUI frameworks that exist today? + +01:01:40 You heard my rant about Visual Basic. + +01:01:42 Yeah, and I was thinking of that. + +01:01:44 I started actually in C++, starting the hard way, doing MFC. + +01:01:49 I don't know if you ever played with that or heard of that. + +01:01:51 Yeah, Microsoft Foundation classes. + +01:01:52 Yeah, yeah, and that was actually pretty decent for a C++ framework. + +01:01:55 And then as soon as I found Visual Basic and C#, I'm like, this is so much better. + +01:01:59 It goes from weeks to days of UI work, you know, and stuff like that. + +01:02:03 And it took me a while to really appreciate building for the web. + +01:02:08 You know, I think I probably made that switch around the year 2000. + +01:02:11 There's a little bit after that, but I really like the web these days. + +01:02:14 I think the web is special. + +01:02:15 I just wish it was easier to take apps from the web and get them to people. + +01:02:20 For example, Firefox canceled progressive web apps. + +01:02:24 iOS has them, but they're kind of, let's not talk about those. + +01:02:27 And if you know the secret, you can probably install one, but you probably shouldn't. + +01:02:30 And we're going to, you know, like if it's just, it's like right on the verge of one more + +01:02:35 step and the web would be really, a really good replacement for those. 
And now we have things like Electron and so on, which I'm not a huge fan of, but it's,
+
+01:02:44 it's kind of what we need to make that happen, but we don't necessarily need Electron, right?
+
+01:02:48 It would be really great.
+
+01:02:49 I'm also really excited.
+
+01:02:50 I don't know how you feel about, I'm really excited about PyScript
+
+01:02:54 and the possibility of running Python for your front-end stuff?
+
+01:02:57 Well, I guess what I was getting at is why do you think Python has not taken off more
+
+01:03:03 for desktop GUI development?
+
+01:03:04 So like we've seen things like Kivy, of course we have PyQt and several other frameworks
+
+01:03:10 that wrap into older C++ GUI frameworks.
+
+01:03:13 But like Python now is so mature, it's the most popular language in the world.
+
+01:03:18 There's nothing preventing it.
+
+01:03:19 Yeah, there's nothing preventing it from making a really nice platform
+
+01:03:22 for native app development, right?
+
+01:03:26 Fundamentally, you could wrap, you could come up with a framework
+
+01:03:29 that abstracts talking to Objective-C, the APIs there, or the Win32 API.
+
+01:03:34 You know, one of the things I'm starting to realize that makes desktop development really different
+
+01:03:39 from back in the day, if you will, is there are so many gatekeepers and barriers
+
+01:03:44 to getting your app onto a machine, right?
+
+01:03:47 If I built a cool desktop app and I gave it to you, your Mac would go, no,
+
+01:03:52 we're not running that. We moved it to the trash for you because it wasn't notarized.
+
+01:03:56 Right. And something similar happens on Windows. And so there's these steps you got to jump through.
+
+01:04:03 And I think there's just been too many gotchas and steps for anybody to push a framework or a way of
+
+01:04:10 doing this all the way through. I mean, in a sense, the web is kind of like good enough that we don't
+
+01:04:16 have to figure out a way to build this.
I think honestly, if I were to be more concerned, I'd be

+01:04:21 more concerned that we can't create truly straightforward mobile apps with Python than

+01:04:26 desktop apps. Yeah. What do you think? That's where I was going to go next. Yeah. And I mean,

+01:04:30 Kivy has been an attempt at that. But, you know, I don't really think it's gotten a ton of traction,

+01:04:36 from what I see. It doesn't look like it's gotten a ton of traction in the way people in the Python

+01:04:41 community hoped it would. So, you know, not that they don't do good work, but it's also not there yet,

+01:04:47 I don't think. I wonder if some of the performance issues, you know, are part of this. So, you know,

+01:04:54 people expect, one of the reasons we like desktop apps and native mobile apps over web apps is

+01:04:59 because we get instantaneous feedback and, you know, really high performance in our user interfaces.

+01:05:05 And, you know, I still find even a PyQt app sometimes a little bit slower than, you know,

+01:05:12 a regular Qt app. And, you know, it's unfortunate. For me, like the big want for the whole Python

+01:05:19 ecosystem is what started to be the focus, I think, of the core developers, which is performance

+01:05:24 improvements. Like, I think that performance improvements would make everyone's lives in the

+01:05:28 ecosystem so much better, you know. And so I think rightly that this has become like one of the central

+01:05:35 focuses of, you know, of the core developers. Yeah. Guido, Mark Shannon, Brandt Bucher,

+01:05:42 a bunch of people whose names I'm not including, they've all done really good work over the last

+01:05:45 three or four years. I was a little bit sad to see Microsoft cancel that project or cancel the

+01:05:50 funding for that project. I mean, the project continues, but still, you know, what has happened

+01:05:55 is so, so much better. That is really a big deal. Yeah. Yeah.
And, you know, I even saw it in the

+01:05:59 course of writing the book. I started writing the book in 2021 and that NES emulator, on the same machine,

+01:06:05 was something like 12 frames per second in 2021.

+01:06:10 And by 2025, it was like 17 frames or 15 or 17 frames per second.

+01:06:13 Yeah, nice.

+01:06:15 I really saw it in these computationally intensive programs in the book.

+01:06:19 I'm pretty positive about it.

+01:06:20 I mean, there's a lot of things that could be better,

+01:06:24 but I think one of the real superpowers is it's approachable, but the ceiling of Python,

+01:06:29 that is, the ceiling of what you can accomplish, is not that low, right?

+01:06:33 You can go pretty far if you have CS skills and ideas.

+01:06:37 And then, you know, pip install, uv install.

+01:06:40 The options of what is out there to just build and click together are incredible.

+01:06:44 What do you think about, and I know you talked about it on the show before.

+01:06:47 I heard a while ago on the show about Mojo and, you know, a total attempt

+01:06:53 that, you know, let's just redo it and we'll get to keep the language syntax, but not the

+01:07:00 runtime.

+01:07:01 Right, right, right.

+01:07:01 And I mean, PyPy is also, of course, kind of an attempt at that as well.

+01:07:05 And some people call for it, right?

+01:07:07 I see sometimes people are like, why don't they replace CPython with PyPy?

+01:07:10 What do you think about, kind of, just, like, the whole thing? Is it too big a topic for the end?

+01:07:14 No, no, it's interesting.

+01:07:16 I think all of those are interesting.

+01:07:18 I think the Mojo performance story is very powerful.

+01:07:22 It's also really hard to bring over to CPython because there's so many different ways that it's used.

+01:07:28 There's so many, you know, it runs on this piece of hardware doing this thing that we just could never optimize for, you know?
+

+01:07:34 So, and then, like I said, with the 600,000 or however many packages there are, you know, how much of that are you willing to carve away to get a faster language?

+01:07:43 And what I think also is a really interesting aspect that people might not think about or take into account that often is you'll see a lot of these benchmarks.

+01:07:51 Like here's the three-body, solving the three-body problem in Python, and here's solving it in Mojo.

+01:07:56 Here's solving the three-body problem in Rust.

+01:07:58 And look at that huge difference.

+01:08:00 But what often happens in Python is you find yourself orchestrating native code anyway.

+01:08:05 Like, okay, we're going to use Polars and we're going to do this thing.

+01:08:08 But when I call the Polars functions, I'm no longer running Python.

+01:08:12 I'm running like a drop of Python and a bunch of Rust.

+01:08:15 And then you're right back in the same, or I'm talking to a database layer,

+01:08:20 or my web app is running on a Rust-based server.

+01:08:23 There's just all these little parts where a lot of times your code speed

+01:08:29 is more about how you're putting the pieces together.

+01:08:31 Yeah.

+01:08:31 Not always.

+01:08:32 If you're doing computational stuff, that's out the window potentially.

+01:08:35 But when you're kind of, you know, you do machine learning, you're doing web apps,

+01:08:39 you're doing database calls.

+01:08:40 A lot of these are like just a layer and then off it goes.

+01:08:44 Yeah, that totally makes sense.

+01:08:45 But then I think it is holding Python back from some of those, you know,

+01:08:50 some of these real interesting domains that people want to get into,

+01:08:54 especially when they're first learning programming, like 3D games.

+01:09:02 Let's stick with that because I'm thinking of ones for people

+01:09:04 in computer science education.
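That "thin layer of Python over native code" point can be seen even in the standard library. Here is a minimal sketch using sqlite3 rather than Polars (the table name and data are invented for illustration): one line of Python hands the actual scan and aggregation to SQLite's C engine.

```python
# Sketch: the Python here is a thin orchestration layer; the real work
# (scanning rows, summing values) happens inside SQLite's C engine.
# Table name and data are made up for illustration.
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE measurements (sensor TEXT, value REAL)")
conn.executemany(
    "INSERT INTO measurements VALUES (?, ?)",
    [("a", float(v)) for v in range(1000)],
)

# One line of Python; the sum over 1,000 rows runs in C, not in the
# Python interpreter loop.
(total,) = conn.execute("SELECT SUM(value) FROM measurements").fetchone()
print(total)  # 499500.0
```

The same shape applies when the engine underneath is Polars, NumPy, or a Rust-based web server: Python glues the pieces together while the hot loop runs in native code.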
+

+01:09:06 A lot of people study computer science because they want to make a game, right?

+01:09:10 Sure.

+01:09:10 And when they want to make a game-

+01:09:12 And I can see a world, like look at one of the biggest game companies

+01:09:14 in the world, it's Unity.

+01:09:16 Yeah.

+01:09:16 Right?

+01:09:17 As a foundational, like, thing you build your games with, not a creator of games.

+01:09:21 They're built, I believe, in C# and .NET, if I remember correctly.

+01:09:24 And that's a faster language, but it's not raging fast.

+01:09:28 You know, it's a lot faster, but it's still a decent way from a C,

+01:09:33 like a pure C language, right?

+01:09:35 Yeah.

+01:09:35 A pure C implementation.

+01:09:37 And they're really, really successful.

+01:09:39 They got close enough.

+01:09:41 I could easily see some company go, we're going to build a game engine.

+01:09:45 We're going to use Python as the language to get as many people

+01:09:48 who have been left out in the cold, in a sense, to do it,

+01:09:52 but we're going to do a Mojo-like thing.

+01:09:54 Or we're going to do something where you don't get to use every library,

+01:09:57 but do you really need Flask in your game?

+01:10:00 Not in your game.

+01:10:01 You know what I mean?

+01:10:01 Yeah, yeah, yeah.

+01:10:02 We're going to build like a smaller, focused, high-performance version

+01:10:06 that looks as close as it could be.

+01:10:09 And we're going to sell you a game engine and a way to ship those games on Steam,

+01:10:13 a way to ship those games on Metal to macOS, et cetera.

+01:10:16 Right?

+01:10:16 Like I could see that world happen.

+01:10:18 Yeah, yeah.

+01:10:19 I don't see me making that world, but I could see that happen.

+01:10:21 And then actually, I think it would be okay.

+01:10:23 How do you feel?

+01:10:24 Like, do you see that as possible?

+01:10:25 No, that makes total sense.
+

+01:10:26 I think another thing we think about is like, how much are skills becoming more transferable

+01:10:32 because of LLMs?

+01:10:33 So can somebody who already learned Python well now quickly pick up C# in Unity

+01:10:39 because the LLM is doing a lot of the detailed syntax for them

+01:10:43 and they just have to understand the programmatic ideas.

+01:10:46 So it's like, you know, maybe Python will continue to always be this great first language for everybody.

+01:10:52 And it'll be easier for people to now transition to other languages for specific domains.

+01:10:57 And so it matters less that we have everything in the original language.

+01:11:01 Okay, out in the chat, there's the recommendation for, that's Godot.

+01:11:07 If you want to go, I'm not super familiar with it.

+01:11:09 I think it's an open source game engine.

+01:11:12 Yes, that much I know, but that's where my knowledge stops.

+01:11:15 Right, and I think it has like, it's like Godot script or GDScript or something as its language, which I think

+01:11:21 has more of a Python-like syntax. Somebody in the chat can correct me. Yeah, interesting. Okay, but I,

+01:11:26 I think if you took an end-to-end thing, you know, it's not just a matter of, like, you can run the

+01:11:31 game engine with this. You've got to take it all the way to here's how you ship your games, because

+01:11:37 soon, what is the very first thing you want to do once you get your game working and fun? You want

+01:11:40 to show your friends, you know what I mean? And you've got to find a way to send it out. So this

+01:11:45 goes back to what you were talking about earlier, is all the hurdles around, you know, getting something

+01:11:50 on the Mac App Store, Steam, or, you know, or whatever. And that goes also to another big thing

+01:11:56 that we're trying to solve in the Python ecosystem, which is the packaging story, right? Which there

+01:12:01 are many solutions
for, but just not one decided on. Let's make this the standard, like, you know, as

+01:12:06 easy as possible thing. Yeah, it's getting better. Things are definitely getting better, but still,

+01:12:11 there is no Python build --format equals EXE.

+01:12:16 Right.

+01:12:16 You know, where, where, what comes out and then some sort of tooling that automatically

+01:12:21 signs that stuff with your certificate you've got as a Windows developer.

+01:12:25 So it doesn't get flagged as malware.

+01:12:27 There's just, even if you get the thing to build, there's like these three or four other

+01:12:31 steps, you know, I'm thinking about iOS.

+01:12:33 It's like, okay, we got it to run Python, but, but really what we needed to do is like

+01:12:37 integrate with SwiftUI and have storyboards where I can, like, I can weave it in. And you're like,

+01:12:43 well, that's no, that's a long ways away. Like I know, but that's, you got to go through those

+01:12:48 stages to get it. I don't know. It's just, there's, there's a little bit, a little bit further to go,

+01:12:52 I guess, but I would love to see it. And I'm, I think it'll happen probably.

+01:12:55 It's not that you can't do it. It's just, there's too much friction right now.

+01:12:58 Yeah. Yeah. Yeah. Well, hopefully you've given some people some ideas.
Somebody's going to go

+01:13:02 start the Unity, the PyUnity company or whatever, and let's, let's make it happen.

+01:13:09 Yeah, yeah. Oh, and also just a real-time follow-up out there: Godot is apparently the pronunciation.

+01:13:16 It's French. Sorry. Okay, no, no, I didn't know it either. I think honestly, just a shout out to

+01:13:20 everyone: if you have a weird name for your project on GitHub, put up an MP3 and say, this is how you say

+01:13:25 it. Just, let's help us out. And not that French is weird, but just, you know, we should

+01:13:31 actually do that with English projects too, right? Yeah, yeah, absolutely, to make them accessible.

+01:13:36 Yeah, it has nothing to do with French. I'm thinking even of something as simple as

+01:13:41 G-unicorn, which is often said Gunicorn, and half the people out there are probably thinking, Michael,

+01:13:47 you're wrong, it's Gunicorn, not G-unicorn. But their logo is a green unicorn. Yeah, I'm like, well, the

+01:13:53 G probably stands for just G, and then certainly the unicorn part is probably unicorn, because the

+01:13:57 unicorn is, you know what I mean? But like, it's totally reasonable to look at it and go, unicorn,

+01:14:03 you know what I mean? But if they just put a little MP3, or even just wrote, it's pronounced

+01:14:08 gee-dash-unicorn, you know what I mean? Yeah, yeah, I think we all need to do that. Yeah. All right,

+01:14:14 well, the chat is getting lively, but we're going to have to call it because we might be just a tiny

+01:14:19 bit over time, and it's getting late in your part of the world. So, well, thank you so much for

+01:14:23 having me on again, Michael. It was really a pleasure. Congratulations again on Talk Python

+01:14:28 in production. Thank you. And Computer Science from Scratch as well.

+01:14:32 Yeah. And if folks want to check out the book, there's a website that

+01:14:36 I'm sure you'll put in the show notes, computersciencefromscratch.com.
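On the packaging story from a few minutes back: there is no built-in "python build --format exe", but the standard library's zipapp module is a small taste of what a first-party packaging story could look like. A minimal sketch follows (the app directory and file names are invented); note that the resulting .pyz still requires Python on the target machine, so it doesn't touch the signing and notarization hurdles discussed above.

```python
# Sketch: bundle a tiny app directory into a single runnable .pyz file
# using only the standard library. Paths and the app are made up.
import os
import subprocess
import sys
import tempfile
import zipapp

workdir = tempfile.mkdtemp()
src = os.path.join(workdir, "myapp")
os.makedirs(src)
with open(os.path.join(src, "__main__.py"), "w") as f:
    f.write("print('hello from a single-file app')\n")

target = os.path.join(workdir, "myapp.pyz")
zipapp.create_archive(src, target)

# The archive runs like a script: python myapp.pyz
out = subprocess.run(
    [sys.executable, target], capture_output=True, text=True
).stdout.strip()
print(out)  # hello from a single-file app
```

Tools like PyInstaller go further by bundling the interpreter itself, but as the conversation notes, the signing, notarization, and store-submission steps remain separate hurdles.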
+

+01:14:40 And you can just learn more about the different projects we do and some of the

+01:14:45 ideas in the book. Yeah. Awesome. People want to get in touch with you

+01:14:48 otherwise? How do they do that? Sure. You can find me on X, I guess,

+01:14:52 at Dave Kopec, D-A-V-E-K-O-P-E-C.

+01:14:55 And if you go to my website, DaveKopec.com, there's a bunch, there's email

+01:14:59 and a bunch of other ways to contact me.

+01:15:01 So D-A-V-E-K-O-P-E-C.com.

+01:15:03 Yeah, I'll put it in the show notes.

+01:15:04 Thanks, Michael.

+01:15:05 Yeah, you bet, you bet.

+01:15:06 So David, thanks for being back on the show.

+01:15:09 It's been really fun.

+01:15:09 Yeah, we'll talk to you later.

+01:15:10 Awesome, thanks, Michael.

+01:15:11 See ya.

+01:15:13 This has been another episode of Talk Python To Me.

+01:15:15 Thank you to our sponsors.

+01:15:16 Be sure to check out what they're offering.

+01:15:18 It really helps support the show.

+01:15:20 This episode is brought to you by Sentry.

+01:15:22 Don't let those errors go unnoticed.

+01:15:23 Use Sentry like we do here at Talk Python.

+01:15:25 Sign up at talkpython.fm/sentry.

+01:15:29 And it's brought to you by NordStellar.

+01:15:32 NordStellar is a threat exposure management platform from the Nord security family,

+01:15:36 the folks behind NordVPN, that combines dark web intelligence,

+01:15:40 session hijacking prevention, brand and domain abuse detection,

+01:15:45 and external attack surface management.

+01:15:47 Learn more and get started keeping your team safe at talkpython.fm/nordstellar.

+01:15:53 If you or your team needs to learn Python, we have over 270 hours of beginner and advanced courses

+01:15:58 on topics ranging from complete beginners to async code, Flask, Django, HTML, and even LLMs.

+01:16:05 Best of all, there's no subscription in sight.

+01:16:08 Browse the catalog at talkpython.fm.
+

+01:16:10 And if you're not already subscribed to the show on your favorite podcast player,

+01:16:14 what are you waiting for?

+01:16:16 Just search for Python in your podcast player.

+01:16:18 We should be right at the top.

+01:16:19 If you enjoyed that geeky rap song, you can download the full track.

+01:16:22 The link is in your podcast player's show notes.

+01:16:25 This is your host, Michael Kennedy.

+01:16:26 Thank you so much for listening.

+01:16:28 I really appreciate it.

+01:16:29 I'll see you next time.

+01:16:42 And we ready to roll Upgrading the code No fear of getting old We tapped into that modern vibe

+01:16:53 Overcame each storm Talk Python To Me Async is the norm

+

diff --git a/transcripts/529-python-apps-with-llm-building-blocks.vtt b/transcripts/529-python-apps-with-llm-building-blocks.vtt
new file mode 100644
index 0000000..86e6055
--- /dev/null
+++ b/transcripts/529-python-apps-with-llm-building-blocks.vtt
@@ -0,0 +1,3872 @@
+WEBVTT

+00:00:00.020 --> 00:00:04.000
+A lot of people building software today never took the traditional computer science path.

+00:00:04.500 --> 00:00:09.420
+They arrived through curiosity, or a job that needed automating, or a late-night itch that

+00:00:09.540 --> 00:00:14.280
+made something work. This week, David Kopec joins me to talk about computer science for

+00:00:14.560 --> 00:00:18.420
+exactly these folks, the ones who learned to program first and are now ready to understand

+00:00:18.520 --> 00:00:23.580
+the deeper ideas that power the tools they use every day. This is Talk Python To Me,

+00:00:23.740 --> 00:00:27.540
+episode 529, recorded October 26, 2025.

+00:00:44.860 --> 00:00:49.180
+Welcome to Talk Python To Me, the number one Python podcast for developers and data scientists.

+00:00:49.760 --> 00:00:55.000
+This is your host, Michael Kennedy. I'm a PSF fellow who's been coding for over 25 years.
+

+00:00:55.560 --> 00:00:56.700
+Let's connect on social media.

+00:00:57.180 --> 00:01:00.240
+You'll find me and Talk Python on Mastodon, Bluesky, and X.

+00:01:00.520 --> 00:01:02.360
+The social links are all in your show notes.

+00:01:03.120 --> 00:01:06.640
+You can find over 10 years of past episodes at talkpython.fm.

+00:01:06.820 --> 00:01:10.000
+And if you want to be part of the show, you can join our recording live streams.

+00:01:10.380 --> 00:01:10.840
+That's right.

+00:01:11.050 --> 00:01:14.300
+We live stream the raw uncut version of each episode on YouTube.

+00:01:14.880 --> 00:01:19.300
+Just visit talkpython.fm/youtube to see the schedule of upcoming events.

+00:01:19.560 --> 00:01:23.180
+Be sure to subscribe there and press the bell so you'll get notified anytime we're recording.

+00:01:23.820 --> 00:01:25.720
+This episode is brought to you by Sentry.

+00:01:26.040 --> 00:01:27.280
+Don't let those errors go unnoticed.

+00:01:27.560 --> 00:01:29.080
+Use Sentry like we do here at Talk Python.

+00:01:29.580 --> 00:01:32.440
+Sign up at talkpython.fm/sentry.

+00:01:33.000 --> 00:01:34.980
+And it's brought to you by NordStellar.

+00:01:35.539 --> 00:01:38.020
+NordStellar is a threat exposure management platform

+00:01:38.600 --> 00:01:39.840
+from the Nord security family,

+00:01:40.240 --> 00:01:41.260
+the folks behind NordVPN,

+00:01:41.800 --> 00:01:43.440
+that combines dark web intelligence,

+00:01:44.080 --> 00:01:45.600
+session hijacking prevention,

+00:01:46.200 --> 00:01:48.200
+brand and domain abuse detection,

+00:01:48.520 --> 00:01:50.280
+and external attack surface management.

+00:01:50.840 --> 00:01:53.000
+Learn more and get started keeping your team safe

+00:01:53.020 --> 00:01:55.800
+at talkpython.fm/Nordstellar.

+00:01:56.520 --> 00:01:58.100
+David, welcome back to Talk Python To Me.

+00:01:58.410 --> 00:01:59.980
+Thank you so much for having me back, Michael.
+ +00:02:00.140 --> 00:02:00.680 +It's really an honor. + +00:02:01.000 --> 00:02:04.220 +Got some really fun stuff to talk about, + +00:02:04.360 --> 00:02:05.960 +computer science from scratch. + +00:02:06.320 --> 00:02:07.140 +What does that even mean? + +00:02:07.460 --> 00:02:08.320 +We're going to find out + +00:02:08.869 --> 00:02:10.340 +because you wrote the book on it. + +00:02:10.500 --> 00:02:11.300 +Yeah, I'm excited. + +00:02:11.640 --> 00:02:13.020 +And this book just came out, + +00:02:13.130 --> 00:02:14.700 +so it's kind of fresh off the press. + +00:02:14.830 --> 00:02:15.880 +And I want to thank you + +00:02:15.930 --> 00:02:17.140 +for being the technical reviewer + +00:02:17.140 --> 00:02:17.860 +on the book, actually. + +00:02:18.320 --> 00:02:19.020 +Yeah, it was really fun. + +00:02:19.020 --> 00:02:19.920 +You reached out and asked me, + +00:02:20.070 --> 00:02:21.880 +and I don't normally do things like that, + +00:02:21.890 --> 00:02:22.520 +but I'm like, you know, + +00:02:22.660 --> 00:02:26.500 +that'd be kind of fun. It would be a good experience. And I do think it was. It taught + +00:02:26.600 --> 00:02:31.140 +me some about creating books and I guess that pays off as well. And congratulations, by the way, + +00:02:31.360 --> 00:02:36.340 +on Talk Python in production coming out. Yeah, thanks. I started that last December and it came + +00:02:36.480 --> 00:02:41.340 +out, I think the beginning of this month, maybe end of September, something like that, but pretty + +00:02:41.600 --> 00:02:45.640 +recently. So yeah, I really appreciate it. It's going really well. We're in kind of the same phase + +00:02:45.780 --> 00:02:50.340 +because computer science from scratch officially came out end of September also. So we're just in + +00:02:50.360 --> 00:02:55.060 +this kind of, you know, one month out from these books coming out. And so it's both, I think it's + +00:02:55.060 --> 00:02:59.080 +an exciting time for both of us. 
And I've started reading Talk Python in production, and I'm really

+00:02:59.240 --> 00:03:03.400
+loving it. So congrats on it. Awesome. Thank you. And I really enjoyed your book. And I had to read

+00:03:03.420 --> 00:03:08.080
+it with a little more detail than just appreciate it, right? I had to like take notes and help you

+00:03:08.100 --> 00:03:12.380
+with some feedback. So that was a really cool experience. And thanks. You know, it's been six,

+00:03:12.640 --> 00:03:16.400
+almost a little, it's over five years, let's put it that way. It's been over five years since you've

+00:03:16.420 --> 00:03:21.720
+been on the show, when we talked about Classic Computer Science Problems in Python, I believe it was.

+00:03:22.510 --> 00:03:26.820
+And yeah, that was really a fun and popular episode. But let's do a quick refresher,

+00:03:26.830 --> 00:03:30.320
+a quick introduction for you to everyone who's new to the show in the last five years.

+00:03:30.740 --> 00:03:35.120
+Okay. So my name is David Kopec. I'm a computer science professor

+00:03:35.520 --> 00:03:40.740
+at Albright College in Reading, Pennsylvania. Actually just moved over here from Champlain

+00:03:40.990 --> 00:03:45.620
+College, where I was for the past nine years in a similar role. And I'm also the program director

+00:03:45.640 --> 00:03:49.580
+of Computer Science and Information Technology. We're launching three new majors, a little plug

+00:03:49.700 --> 00:03:55.100
+for Albright, a revamped computer science major, an artificial intelligence major, and cybersecurity.

+00:03:55.760 --> 00:04:01.820
+And so I'm managing the launch of those for fall 2026. My background is for the past decade,

+00:04:02.120 --> 00:04:06.580
+obviously, as a computer science educator.
I've written five books on computer science and

+00:04:06.880 --> 00:04:10.680
+programming, the most successful of which was Classic Computer Science Problems in Python,

+00:04:11.100 --> 00:04:16.780
+which of course I was on the show five years ago about. And this book, Computer Science from Scratch,

+00:04:17.100 --> 00:04:22.400
+is kind of for a similar audience. Both books are for Python programmers, folks who are intermediate

+00:04:22.820 --> 00:04:28.060
+or advanced Python programmers, but maybe who want to fill in the kind of computer science topics

+00:04:28.340 --> 00:04:31.720
+that they don't know, maybe because they're self-educated. Maybe they learned Python on

+00:04:31.730 --> 00:04:36.000
+their own. They didn't have a formal CS degree, or maybe they did have a formal CS degree,

+00:04:36.080 --> 00:04:41.040
+but now they're preparing for interviews or transitioning to a more CS in-depth role.

+00:04:41.370 --> 00:04:43.100
+And they want to refresh on some of these topics.

+00:04:43.720 --> 00:04:48.220
+So both books are for folks who know Python, but want to go deeper on CS.

+00:04:48.760 --> 00:04:53.020
+Right. They're not, here's how you do a loop in Python, or here's what itertools is.

+00:04:53.300 --> 00:04:56.420
+It's true computer science topics, right?

+00:04:56.720 --> 00:05:00.320
+Right. And I think somebody at that stage will be very frustrated by the books.

+00:05:00.690 --> 00:05:03.100
+They really are. And so we have to put that preface in.

+00:05:03.140 --> 00:05:06.060
+And they really are for intermediate or advanced Python programmers.

+00:05:07.020 --> 00:05:12.160
+So, you know, so I'm trying to reach the same audience, but they're totally different books

+00:05:12.220 --> 00:05:13.860
+in terms of the type of topics they cover.
+

+00:05:14.300 --> 00:05:18.380
+So Classic Computer Science Problems in Python was more of a data structures and algorithms

+00:05:18.520 --> 00:05:23.880
+type of book, did some AI-type topics as well, but very much on the algorithm side.

+00:05:24.240 --> 00:05:28.979
+Computer Science from Scratch is more, let's build up the layers of the software stack from

+00:05:29.000 --> 00:05:33.960
+the bottom up that Python kind of runs on top of. So you understand what's actually happening under

+00:05:33.980 --> 00:05:39.460
+the hood. So how does an interpreter work? What's the hardware-software interface like? We get that

+00:05:39.600 --> 00:05:44.160
+with emulators. And then there is still some algorithmic stuff as well, different topics than

+00:05:44.300 --> 00:05:49.580
+in the prior book, but stuff like a little computer art stuff, a little bit of machine learning. So

+00:05:49.900 --> 00:05:54.000
+we try, it's still a survey book. It's still pretty broad, but it's more about the layers of

+00:05:54.000 --> 00:05:58.180
+the software stack than Classic Computer Science Problems in Python was. Yeah.
And there's some

+00:05:58.120 --> 00:06:02.240
+pretty low-level stuff, and it's interesting you're doing it in Python. You know, I think a lot of

+00:06:02.340 --> 00:06:09.180
+times that would be taught in at least a pointer-based language, you know, like C, or even C being

+00:06:09.280 --> 00:06:13.680
+taught in assembly, like you're talking to the machine, like literally. Absolutely. And you know,

+00:06:13.710 --> 00:06:19.160
+if you took these classes, like if you took an architecture class, which I'm going to be teaching

+00:06:19.260 --> 00:06:23.779
+next semester at Albright, it's going to be in assembly language, right? And if you took a compilers

+00:06:23.800 --> 00:06:29.240
+course, it's usually in something like C, or today it might be in Rust, a low-level language where

+00:06:29.270 --> 00:06:34.860
+you do have that direct access to memory. But I wanted to make the book accessible. And Python is

+00:06:34.940 --> 00:06:38.360
+the most accessible language in the world. It's the most popular language in the world. And it's

+00:06:38.360 --> 00:06:42.540
+the language that we're increasingly using in computer science education anyway. So it is

+00:06:42.650 --> 00:06:48.339
+actually a little unusual to cover some of these topics in Python, but it makes the topics accessible

+00:06:48.360 --> 00:06:49.700
+to a much wider audience.

+00:06:50.200 --> 00:06:52.660
+It definitely makes it accessible to a wider audience.

+00:06:53.540 --> 00:06:56.960
+Let's take a step back and talk about this role you have

+00:06:56.990 --> 00:06:59.760
+in this new college, university you're at.

+00:06:59.960 --> 00:07:01.980
+You said that you're revamping the CS program.

+00:07:02.420 --> 00:07:04.120
+You're adding AI and cybersecurity.

+00:07:04.760 --> 00:07:05.600
+That's a big shift.
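An aside on the "how does an interpreter work?" question from a moment ago: the smallest useful mental model is a stack-based virtual machine, which has the same basic shape as CPython's bytecode loop: fetch an instruction, dispatch on it, push and pop an operand stack. Here is a toy sketch of that idea (this is not code from the book; the opcode names are invented for illustration).

```python
# Toy stack-based VM, miniature version of the shape of CPython's
# bytecode loop: fetch an instruction, dispatch on it, use a stack.
# Opcode names are invented for this sketch.
def run(program):
    stack = []
    for op, arg in program:
        if op == "PUSH":
            stack.append(arg)
        elif op == "ADD":
            b, a = stack.pop(), stack.pop()
            stack.append(a + b)
        elif op == "MUL":
            b, a = stack.pop(), stack.pop()
            stack.append(a * b)
        else:
            raise ValueError(f"unknown opcode: {op}")
    return stack.pop()

# (2 + 3) * 4, flattened into a linear instruction sequence:
program = [
    ("PUSH", 2),
    ("PUSH", 3),
    ("ADD", None),
    ("PUSH", 4),
    ("MUL", None),
]
print(run(program))  # 20
```

A real interpreter adds a compiler front end, variables, control flow, and a richer instruction set, but the dispatch-loop-plus-stack core is the same idea scaled up.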
+ +00:07:05.720 --> 00:07:08.440 +Do you want to just talk to people about how that's changing, + +00:07:08.840 --> 00:07:10.200 +why you decided to change? + +00:07:10.760 --> 00:07:12.520 +I look at a lot of CS programs, + +00:07:12.940 --> 00:07:15.540 +and I don't know how connected they are to the real world. + +00:07:15.800 --> 00:07:20.680 +So imagine some of these changes are to sort of realign it with what's happened recently. + +00:07:21.000 --> 00:07:21.220 +Absolutely. + +00:07:21.620 --> 00:07:21.720 +Yeah. + +00:07:22.180 --> 00:07:26.880 +So I'm coming from a college, the Champlain College in Vermont, where I was at for nine + +00:07:27.080 --> 00:07:29.300 +years, which was a professionally focused college. + +00:07:29.710 --> 00:07:33.000 +So it was a college where we were preparing students for their careers. + +00:07:33.600 --> 00:07:36.200 +I moved now to Albright, which is a liberal arts college. + +00:07:36.310 --> 00:07:40.040 +So it's a college that's more teaching people how to think, creating great citizens. + +00:07:40.640 --> 00:07:42.980 +And hopefully we prepare them for their careers very well, too. + +00:07:43.260 --> 00:07:45.780 +but there's that liberal arts foundation underneath everything. + +00:07:46.340 --> 00:07:49.440 +So one of the challenges in developing these three new majors + +00:07:50.000 --> 00:07:52.260 +was how do you fit them into a liberal arts curriculum? + +00:07:52.780 --> 00:07:54.420 +And how do you make them relevant to careers + +00:07:54.720 --> 00:07:56.280 +while still being true to the liberal arts? + +00:07:56.620 --> 00:07:58.580 +And so what I found the best way to do it + +00:07:58.720 --> 00:08:02.700 +is to still focus on how to help people think computationally. + +00:08:03.020 --> 00:08:05.800 +And so there's a firm foundation of computer science + +00:08:05.940 --> 00:08:07.940 +and mathematics throughout all three majors. 
+ +00:08:08.380 --> 00:08:09.540 +So whether you're doing cybersecurity, + +00:08:10.040 --> 00:08:11.920 +artificial intelligence, or computer science, + +00:08:12.440 --> 00:08:16.780 +you're going to have a lot of classes in common with one another. And we also need to infuse some + +00:08:16.780 --> 00:08:21.200 +of the issues we see in the workforce today. Like all of them will have a computer ethics course + +00:08:21.680 --> 00:08:25.980 +that's incorporated. And all of them will have an internship course that's incorporated. So + +00:08:26.160 --> 00:08:29.340 +students are getting real world experience as they're going through these majors. + +00:08:29.960 --> 00:08:34.960 +I think that's a great idea. It's really important to have done software engineering + +00:08:35.700 --> 00:08:39.020 +hands-on with other companies, real products, and real product managers, + +00:08:39.380 --> 00:08:42.219 +in addition to just knowing the algorithms and foundations. + +00:08:42.740 --> 00:08:48.300 +Absolutely. I couldn't agree more. And so we're trying to blend the two. And that's been one of + +00:08:48.300 --> 00:08:52.280 +the exciting things. And you asked why I changed. I was excited about this opportunity to build these + +00:08:52.310 --> 00:08:57.260 +new majors from the ground up. We'll be one of the first teaching colleges, small teaching colleges, + +00:08:57.350 --> 00:09:01.900 +to offer an artificial intelligence major. It's just been in the last five years that we've seen + +00:09:02.160 --> 00:09:06.679 +colleges start to offer an undergraduate major in artificial intelligence. And it's mostly been + +00:09:06.700 --> 00:09:14.140 +big name brand colleges like CMU that are offering these first artificial intelligence bachelor's + +00:09:14.180 --> 00:09:19.200 +degrees. 
But we get to be at the forefront of how do you fit that into a small liberal arts college

+00:09:19.640 --> 00:09:23.140
+and still make it career relevant and still have that firm liberal arts foundation.

+00:09:23.680 --> 00:09:28.300
+Are you concerned about letting all these hackers in on your network if you're teaching cybersecurity?

+00:09:28.820 --> 00:09:32.540
+Well, I won't be teaching the cybersecurity courses. So I'm firmly just in the computer

+00:09:32.540 --> 00:09:36.260
+science education courses, and we're hiring additional faculty in cybersecurity and in

+00:09:36.420 --> 00:09:41.720
+artificial intelligence. I'm more on the management side of those two programs. But, you know, look,

+00:09:41.840 --> 00:09:46.060
+it's more relevant than ever. When you look at actually the number of students going to college

+00:09:46.460 --> 00:09:51.840
+for cybersecurity, it's been exploding over the last decade. You know, I just talked about how

+00:09:52.040 --> 00:09:56.500
+artificial intelligence has just come about as a bachelor's degree. Cybersecurity was kind of there

+00:09:56.660 --> 00:10:01.699
+15 years ago. So cybersecurity started out as maybe you took a computer security course at the

+00:10:01.660 --> 00:10:05.480
+end of your computer science degree. Then it became a concentration you could do within some

+00:10:05.710 --> 00:10:10.280
+computer science degrees. And then in the 2010s, it broke out and became its own bachelor's degree.

+00:10:10.640 --> 00:10:14.460
+And so we're seeing the same thing right now with AI. It started out, you know, when you did

+00:10:14.960 --> 00:10:19.360
+undergrad computer science, maybe in the 90s, or like me in the aughts, you might have one course,

+00:10:19.510 --> 00:10:23.840
+like an intro to AI course that you have at the end of your bachelor's degree. Then it started to

+00:10:23.880 --> 00:10:27.359
+be something where you do a concentration. At Champlain,
I developed a concentration in + +00:10:27.380 --> 00:10:31.620 +artificial intelligence. Now, just in the last five years, we're seeing it break out from being + +00:10:31.710 --> 00:10:38.000 +just a part of a CS degree to being its whole own, you know, adjacent degree. So I think it's an + +00:10:38.180 --> 00:10:42.460 +exciting time for that. Of course, there's a hype cycle right now about AI. So a lot of colleges + +00:10:42.630 --> 00:10:47.820 +are jumping in and I think some of them are doing it the right way. And, you know, having that firm + +00:10:47.960 --> 00:10:51.980 +foundation in computer science and mathematics, so it's durable. And some of them are doing it in + +00:10:52.120 --> 00:10:56.579 +kind of a shallow way. We're trying to do it the right way. Yeah, that sounds great. And obviously + +00:10:56.600 --> 00:11:00.840 +cybersecurity is one of those things that's highly valuable. Nobody wants to be in the news + +00:11:01.480 --> 00:11:05.500 +for that reason, right? So companies are certainly looking for people with those skills. + +00:11:06.440 --> 00:11:12.620 +Let's talk one more thing about the university and not the AI focus, but just AI. One of the + +00:11:12.630 --> 00:11:19.220 +things I think is of all the places that's getting the most scrambled, changed, under pressure, + +00:11:19.480 --> 00:11:23.980 +whatever you call it, from AI, I think it's education in general. And I'm not talking just + +00:11:24.000 --> 00:11:28.000 +college. I'm talking, I don't know, like third grade. As soon as the kids can start using AI, + +00:11:28.220 --> 00:11:32.120 +they're like, this is like the calculator with a rocket booster on it. You know what I mean? Like + +00:11:32.200 --> 00:11:38.200 +this will solve the problems. And I think there's a really big challenge for you as universities to + +00:11:38.480 --> 00:11:43.700 +connect with students, keep academic integrity as well. 
But there's also a huge problem, I think, + +00:11:43.760 --> 00:11:49.500 +for the students to not let it undercut their education and end up going, well, all I know + +00:11:49.480 --> 00:11:54.560 +to do is to do ChatGPT. You're absolutely right. I mean, it's a huge challenge in computer science + +00:11:54.920 --> 00:12:00.060 +education. And I don't think the computer science education community has yet completely figured it + +00:12:00.240 --> 00:12:05.460 +out. Well, basically what it breaks down to is exactly what you said. Students are just using + +00:12:05.620 --> 00:12:12.060 +ChatGPT or GitHub Copilot to do their homework. And if you're in a first year or second year class, + +00:12:12.460 --> 00:12:17.459 +I've been mostly teaching upper level CS for the past like six plus years. And I'm teaching an + +00:12:17.440 --> 00:12:22.680 +intro class for the first time in a long time this semester. And it's been eye-opening for me + +00:12:22.960 --> 00:12:28.180 +to see how many students are trying to just do every assignment with ChatGPT. And we still have + +00:12:28.180 --> 00:12:32.840 +to give them basic assignments. When you're first learning how to do a for loop or what a function + +00:12:33.100 --> 00:12:36.660 +is or what an if statement is even, right? You got to write some of those things. + +00:12:36.660 --> 00:12:39.880 +You can't say implement a database and get back to me next week. You got to start somewhere. + +00:12:40.260 --> 00:12:45.439 +So we still have to teach these fundamentals, but we have this opponent to us in some sense + +00:12:45.440 --> 00:12:51.320 +of this ease of access to something that can just do all the work for you. And so I'm sure that, + +00:12:51.430 --> 00:12:55.640 +you know, like you mentioned, mathematics educators had similar challenges in the 1970s, + +00:12:55.860 --> 00:13:02.500 +1980s as calculators became prominent. 
So what I've done, and I've been adjusting since ChatGPT
+
+00:13:02.580 --> 00:13:07.940
+came out in fall 2022, I've been constantly adjusting and reevaluating, but I have had to go
+
+00:13:08.040 --> 00:13:12.700
+back to the future a little bit and people might find this a little anachronistic, but I've heard
+
+00:13:12.660 --> 00:13:17.860
+about it from other CS educators as well, going back to doing some paper exams. I know that sounds
+
+00:13:18.300 --> 00:13:23.760
+crazy and I know that sounds bizarre, but at some point you need to evaluate if students
+
+00:13:24.180 --> 00:13:29.180
+actually know how to write a for loop because, you know, while some people think LLMs are
+
+00:13:29.180 --> 00:13:33.300
+going to replace software engineers in the next two years, you still need to understand what it's
+
+00:13:33.440 --> 00:13:36.260
+outputting. And I don't know about you, Michael, but I don't think they're
+
+00:13:36.420 --> 00:13:41.220
+completely replacing software engineers in the next couple of years. And that's coming from me, and I don't
+
+00:13:41.180 --> 00:13:46.900
+know how many people necessarily agree. I'm a huge fan of agentic coding and what it can do
+
+00:13:47.000 --> 00:13:53.160
+for productivity. And it's incredibly powerful, but it's one of those things that it needs someone to
+
+00:13:53.300 --> 00:13:58.300
+guide it who knows how to do that. And then it becomes a superpower. If you don't, you end up
+
+00:13:58.300 --> 00:14:02.800
+with like, how did we end up on React? I thought this was a Rust project. You're like, what happened
+
+00:14:02.920 --> 00:14:06.619
+here? Yeah. And you need to understand when it makes mistakes, you need to know how to correct
+
+00:14:06.640 --> 00:14:11.880
+those mistakes. And of course, you need to be understanding everything that it's outputting,
+
+00:14:11.880 --> 00:14:18.120
+so you're auditing it.
This portion of Talk Python To Me is brought to you by Sentry's Seer.
+
+00:14:18.820 --> 00:14:24.960
+I'm excited to share a new tool from Sentry, Seer. Seer is your AI-driven pair programmer that finds,
+
+00:14:25.160 --> 00:14:30.999
+diagnoses, and fixes code issues in your Python app faster than ever. If you're already using
+
+00:14:31.020 --> 00:14:37.240
+Sentry, you are already using Sentry, right? Then using Seer is as simple as enabling a feature on
+
+00:14:37.240 --> 00:14:42.600
+your already existing project. Seer taps into all the rich context Sentry has about an error.
+
+00:14:43.220 --> 00:14:48.580
+Stack traces, logs, commit history, performance data, essentially everything. Then it employs its
+
+00:14:48.880 --> 00:14:54.020
+agentic AI code capabilities to figure out what is wrong. It's like having a senior developer
+
+00:14:54.360 --> 00:15:00.260
+pair programming with you on bug fixes. Seer then proposes a solution, generating a patch for your
+
+00:15:00.180 --> 00:15:05.700
+code and even opening a GitHub pull request. This leaves the developers in charge because it's up to
+
+00:15:05.700 --> 00:15:11.780
+them to actually approve the PR. But it can reduce the time from error detection to fix dramatically.
+
+00:15:12.500 --> 00:15:17.120
+Developers who've tried it found it can fix errors in one shot that would have taken them
+
+00:15:17.430 --> 00:15:25.439
+hours to debug. Seer boasts a 94.5% accuracy in identifying root causes. Seer also prioritizes
+
+00:15:25.460 --> 00:15:31.860
+actionable issues with an actionability score, so you know what to fix first. This transforms
+
+00:15:32.240 --> 00:15:38.000
+Sentry errors into actionable fixes, turning a pile of error reports into an ordered to-do list.
+
+00:15:38.660 --> 00:15:43.800
+If you could use an always-on-call AI agent to help track down errors and propose fixes before
+
+00:15:43.900 --> 00:15:50.120
+you even have time to read the notification, check out Sentry's Seer. Just visit talkpython.fm
+
+00:15:50.260 --> 00:15:54.840
+slash Seer, S-E-E-R. The link is in your podcast player's show notes.
+
+00:15:55.360 --> 00:16:01.160
+Be sure to use our code TALKPYTHON, one word, all caps. Thank you to Sentry for supporting Talk
+
+00:16:01.320 --> 00:16:08.160
+Python To Me. So we are having a real challenge, especially in those intro classes, of how do you
+
+00:16:08.300 --> 00:16:13.240
+kind of force students to not use these tools, essentially, because you're not learning anything
+
+00:16:13.310 --> 00:16:18.600
+if the tool writes the for loop for you when you're first learning how to do a for loop. And so you have
+
+00:16:18.600 --> 00:16:24.900
+to find ways to encourage it, to win hearts and minds, I think. Of course, a big part of it
+
+00:16:24.920 --> 00:16:30.940
+is convincing students and being dynamic and enthusiastic about how good it feels to really
+
+00:16:31.140 --> 00:16:36.020
+understand how this actually works. But then there has to be enforcement too. And sometimes that feels
+
+00:16:36.050 --> 00:16:41.560
+a little anachronistic, being forceful about it or going back to doing tests on paper. But we have to
+
+00:16:41.700 --> 00:16:46.920
+have ways of ensuring the knowledge is actually there. I feel like this is not just a college
+
+00:16:47.140 --> 00:16:52.860
+student issue, but I think it's especially relevant in that part of your career that the struggle is
+
+00:16:52.880 --> 00:16:59.380
+not in the way. The struggle is often part of what unlocks your thinking. It's part of what
+
+00:16:59.600 --> 00:17:03.500
+cements the knowledge and makes you feel a true sense of accomplishment.
When you're like, + +00:17:03.720 --> 00:17:08.920 +I tried this and I couldn't get it to work. But three hours later, I finally figured it out. And + +00:17:09.020 --> 00:17:15.100 +I now understand iterators. Finally. You know what I mean? And it's just so easy to push the easy + +00:17:15.449 --> 00:17:21.380 +button and just say, chat, why? Yeah. You know? And that's what feels good about being a teacher + +00:17:21.400 --> 00:17:24.380 +is being there for those aha moments with students. + +00:17:24.730 --> 00:17:26.319 +I had some moments like that last week + +00:17:26.430 --> 00:17:27.480 +and they reminded me, + +00:17:27.959 --> 00:17:29.420 +this is why you're in this career. + +00:17:31.280 --> 00:17:34.460 +It's really something that can become addictive + +00:17:34.860 --> 00:17:35.600 +to students actually. + +00:17:35.810 --> 00:17:37.160 +When they start having those aha moments, + +00:17:37.320 --> 00:17:39.840 +they want more of them and it spurs on. + +00:17:40.300 --> 00:17:41.940 +That's how you end up at like 3 a.m. + +00:17:42.440 --> 00:17:43.000 +really hungry, + +00:17:43.790 --> 00:17:44.860 +wondering why you haven't gone to sleep + +00:17:45.000 --> 00:17:45.640 +but still programming. + +00:17:45.810 --> 00:17:47.660 +You're like, this is amazing, I can't stop. + +00:17:48.140 --> 00:17:48.700 +Right, right, right. + +00:17:48.840 --> 00:17:53.420 +And it takes a certain mindset to be able to appreciate those moments. + +00:17:54.500 --> 00:18:02.760 +And, you know, this is like a sidebar, but one thing we're also seeing in computer science education is we see a lot of folks who go into it sometimes for the wrong reasons. + +00:18:03.560 --> 00:18:07.760 +Folks sometimes go into computer science just because they hear, this is a great way to get a good job. + +00:18:08.560 --> 00:18:15.240 +And if that's your only motivation going into it, you're probably not going to be successful in it, unfortunately. 
+ +00:18:15.760 --> 00:18:17.400 +It's going to be tough. Yeah, it's going to be tough. + +00:18:17.700 --> 00:18:20.900 +But a lot of people are there for the right reasons. So I think that that's good. + +00:18:21.160 --> 00:18:25.520 +Absolutely. And I hope with a book like this, folks who came in from the other side, + +00:18:25.900 --> 00:18:31.780 +folks who came in because they had that interest, but they didn't have the chance to either go to + +00:18:31.900 --> 00:18:35.220 +university, maybe they couldn't afford it, maybe they studied something else and they're later in + +00:18:35.300 --> 00:18:41.680 +their career. This will hopefully give them a bunch of those aha moments and about topics that + +00:18:41.680 --> 00:18:46.720 +are deeper than just how to write a for loop. Yeah, absolutely. Well, good. I mean, I feel like + +00:18:46.740 --> 00:18:51.000 +there's probably some middle ground you might be able to accomplish if you're like, all right, + +00:18:51.320 --> 00:18:56.440 +we're going to give you a laptop, a testing laptop that has no internet. Like we've gone in there and + +00:18:56.720 --> 00:19:01.240 +crushed the network card. Yeah. Here, take the test. And here's your thumb drive to submit your + +00:19:01.400 --> 00:19:04.660 +work and potentially, you know, potentially. Yeah. Yeah. And when we do stuff like that, + +00:19:04.780 --> 00:19:10.320 +like I just did a test on, you know, an online test for my students and I'm just monitoring to + +00:19:10.400 --> 00:19:15.300 +make sure nobody's using the tools. You know, but there is something said still for paper exams. + +00:19:15.380 --> 00:19:19.660 +I'll tell you why. 
Sometimes, even with today's tools for giving exams, you want students
+
+00:19:19.710 --> 00:19:23.520
+to draw something, especially if it's a data structures and algorithms class, you might want
+
+00:19:23.550 --> 00:19:27.740
+them to draw a tree and it's just actually easier for them to do that on paper. So when people hear
+
+00:19:27.860 --> 00:19:31.320
+paper, they might be like, oh my goodness, what are you doing that for? No, there's real reasons.
+
+00:19:32.280 --> 00:19:37.220
+Yeah, of course. But yeah, we have monitoring tools too. Yeah, very good. All right. Let's maybe
+
+00:19:37.680 --> 00:19:41.480
+take a couple of different examples or different chapters in the book and talk through them.
+
+00:19:41.960 --> 00:19:47.860
+The first one, the first main topic is the smallest possible programming language, right?
+
+00:19:48.240 --> 00:19:49.040
+Yes, yes.
+
+00:19:49.660 --> 00:19:50.240
+Tell us about this.
+
+00:19:50.560 --> 00:19:55.860
+Yeah, the premise of the chapter is what's the minimum that we need to have a programming language?
+
+00:19:56.380 --> 00:19:57.620
+And there's a famous programming language.
+
+00:19:57.760 --> 00:20:02.560
+I'm actually not going to use the name of the language on the show, I think, just because it has the F word in it.
+
+00:20:02.620 --> 00:20:03.960
+And I didn't make up the name of the language.
+
+00:20:04.520 --> 00:20:07.540
+It was developed 30 years ago, but we'll just call it BF, okay?
+
+00:20:07.800 --> 00:20:07.880
+Yeah.
+
+00:20:09.160 --> 00:20:10.160
+Brain F-star.
+
+00:20:10.620 --> 00:20:11.300
+Yeah, sure.
+
+00:20:11.620 --> 00:20:17.240
+Or an F-star. This language, it only has eight symbols in it. I mean, it literally only has eight
+
+00:20:17.360 --> 00:20:22.000
+symbols in it, yet it's what we call Turing complete.
And I'm not going to, I won't go into
+
+00:20:22.000 --> 00:20:25.300
+the full details of what it means for something to be Turing complete, but let me put it this way.
+
+00:20:25.540 --> 00:20:30.740
+A language that is Turing complete can theoretically solve any of the same algorithmic
+
+00:20:31.020 --> 00:20:35.000
+problems as any other language that's Turing complete. And every programming language that
+
+00:20:35.000 --> 00:20:39.159
+you use is Turing complete, whether it's Java, Python, C, whatever, of course, they're all
+
+00:20:39.100 --> 00:20:44.120
+Turing complete. This language with only eight symbols in it is also Turing complete. So while
+
+00:20:44.150 --> 00:20:49.820
+you could code something like Quicksort in Python, you could also code it in BF with just eight
+
+00:20:50.000 --> 00:20:55.920
+symbols. While you could code a JSON parser in Python, you could also code a JSON parser in BF.
+
+00:20:56.260 --> 00:21:00.740
+So by learning this really, really basic language and actually implementing it,
+
+00:21:00.750 --> 00:21:04.500
+so implementing an interpreter for it, something that can actually run programs written in it,
+
+00:21:04.540 --> 00:21:09.120
+you really get to understand just how little we need to solve computational problems.
+
+00:21:09.340 --> 00:21:10.180
+We don't need much.
+
+00:21:10.960 --> 00:21:19.800
+We need a very minimal amount of computing machinery and a very minimal amount of syntax to be able to solve most problems.
+
+00:21:20.320 --> 00:21:22.220
+Now, would you want to solve most problems in BF?
+
+00:21:22.340 --> 00:21:22.860
+Of course not.
+
+00:21:23.320 --> 00:21:28.940
+We just use it as an illustrative example to show this is how simple a programming language can really be
+
+00:21:29.400 --> 00:21:33.320
+and still have all the same capabilities as a more advanced programming language.
+
+00:21:33.700 --> 00:21:36.800
+And you might wonder then why we have much more advanced programming languages.
+
+00:21:37.300 --> 00:21:39.020
+Because they give us a lot more developer productivity.
+
+00:21:39.200 --> 00:21:43.380
+They have more abstractions that let us think as human beings instead of thinking like machines.
+
+00:21:43.800 --> 00:21:45.080
+The expressiveness, absolutely.
+
+00:21:45.880 --> 00:21:48.440
+And so that's great that we have all of that.
+
+00:21:48.800 --> 00:21:54.600
+But if you really want to understand essentially what the language is doing at the lowest levels, you only need these few bits.
+
+00:21:55.040 --> 00:21:56.980
+So what's the key message here?
+
+00:21:57.160 --> 00:22:00.480
+Obviously, you're not trying to get people to become BF experts, right?
+
+00:22:00.660 --> 00:22:04.020
+I mean, maybe learn COBOL over BF these days.
+
+00:22:05.200 --> 00:22:06.840
+It could work for the Social Security Administration.
+
+00:22:07.340 --> 00:22:08.120
+Yes, exactly.
+
+00:22:08.300 --> 00:22:10.360
+There's going to be some high-paying COBOL jobs out there.
+
+00:22:11.400 --> 00:22:18.760
+But this is more about writing an actual interpreter, very much like CPython itself, in a sense, conceptually.
+
+00:22:19.620 --> 00:22:20.060
+Conceptually.
+
+00:22:20.250 --> 00:22:22.180
+I mean, much, much, much simpler.
+
+00:22:22.440 --> 00:22:28.020
+And actually, in the next chapter, we get to writing a BASIC interpreter, which is just one step up from BF.
+
+00:22:28.510 --> 00:22:29.760
+And we can talk about that in a minute.
+
+00:22:29.820 --> 00:22:34.120
+But yeah, it's about understanding how it works at a low level.
+
+00:22:34.350 --> 00:22:35.480
+Like what's it actually doing?
+
+00:22:35.620 --> 00:22:38.440
+And, you know, it feels good to make something that can run programs.
+
+00:22:39.800 --> 00:22:44.420
+A lot of people, when they get into computer science, are actually excited about like making their own language.
+
+00:22:45.020 --> 00:22:48.700
+So by doing these first couple of chapters of the book, you're actually on the path to that.
+
+00:22:48.750 --> 00:22:57.400
+I think after reading these first two chapters, you could go implement your own simple language and really make that kind of dream that a lot of us have come true.
+
+00:22:57.860 --> 00:22:59.500
+So it's about understanding how it works at a low level.
+
+00:22:59.720 --> 00:23:02.240
+I have to say the book is not a practical book.
+
+00:23:02.360 --> 00:23:04.460
+It's not like Talk Python in Production
+
+00:23:05.020 --> 00:23:05.820
+where you're going to learn,
+
+00:23:06.070 --> 00:23:07.820
+you know, here's some useful tools
+
+00:23:08.020 --> 00:23:09.240
+and tips and strategies.
+
+00:23:09.610 --> 00:23:11.240
+Install this thing and set up that config
+
+00:23:11.410 --> 00:23:12.100
+and then run it this way.
+
+00:23:12.100 --> 00:23:12.900
+To do something right now.
+
+00:23:13.480 --> 00:23:13.560
+Right.
+
+00:23:13.920 --> 00:23:14.720
+Computer Science from Scratch
+
+00:23:14.720 --> 00:23:16.700
+is not going to help you build your next app
+
+00:23:16.940 --> 00:23:17.960
+like the one you're building.
+
+00:23:18.680 --> 00:23:19.240
+Not directly.
+
+00:23:19.660 --> 00:23:20.260
+Not directly,
+
+00:23:20.520 --> 00:23:21.960
+but it is going to help you understand
+
+00:23:22.110 --> 00:23:24.060
+a lot more about what's happening
+
+00:23:24.190 --> 00:23:24.840
+under the covers,
+
+00:23:25.370 --> 00:23:27.100
+which is ultimately going to make you think
+
+00:23:27.360 --> 00:23:29.139
+more broadly as a software developer
+
+00:23:29.140 --> 00:23:32.500
+and understand different strategies that you might be able to use.
+
+00:23:32.500 --> 00:23:34.400
+I'll give you an example with this interpreter stuff.
+
+00:23:35.100 --> 00:23:39.080
+You might be writing a program that needs to have some kind of configuration files,
+
+00:23:39.620 --> 00:23:43.040
+and you want to maybe be able to parse those configuration files.
+
+00:23:43.520 --> 00:23:46.180
+Well, part of writing an interpreter is writing a parser,
+
+00:23:46.580 --> 00:23:49.940
+something that understands the syntax of the programming language
+
+00:23:50.130 --> 00:23:51.720
+and starts to get towards the semantics.
+
+00:23:52.180 --> 00:23:54.860
+And you might want to be able to write a parser later on
+
+00:23:55.000 --> 00:23:58.579
+for some very specific configuration-type format that you've come up with,
+
+00:23:58.800 --> 00:24:02.720
+or maybe even just a file format for more of like an office-type program.
+
+00:24:03.240 --> 00:24:08.140
+And you're going to need some way of understanding the techniques of how to parse that. Learning how
+
+00:24:08.140 --> 00:24:12.240
+an interpreter works will help you write that program later on. So it's about learning computational
+
+00:24:12.660 --> 00:24:16.760
+techniques, learning problem solving techniques more than it is about like something that's going to
+
+00:24:17.040 --> 00:24:20.700
+necessarily be, this is exactly how you do this for the next app you're going to build.
+
+00:24:20.940 --> 00:24:28.560
+It's not something that you would just stumble across most of the time, I think, right? You'll stumble
+
+00:24:28.580 --> 00:24:32.920
+across, you know, oh, here's an ML library or something, but you don't typically stumble across
+
+00:24:33.260 --> 00:24:37.880
+and here's how you build a parser from scratch. Right. And so, you know, there has to be that
+
+00:24:38.080 --> 00:24:43.160
+curiosity there. So I will admit that, you know, if you have no interest and you're not the type
+
+00:24:43.160 --> 00:24:46.380
+of person who wants to understand how things work, you won't like this book.
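To make the parsing idea above concrete, here's a minimal sketch of a hand-written parser for a made-up `key = value` config format. The format, the function name, and the supported value types are all our own illustration for this discussion, not anything from the book:

```python
import re

def parse_config(text: str) -> dict:
    """Parse lines like: host = "localhost", port = 8080, debug = true."""
    result = {}
    for lineno, line in enumerate(text.splitlines(), start=1):
        line = line.split("#", 1)[0].strip()       # allow trailing comments
        if not line:
            continue                               # skip blank lines
        # Syntax phase: every line must match the key = value shape.
        m = re.match(r'^(\w+)\s*=\s*(.+)$', line)
        if m is None:
            raise SyntaxError(f"line {lineno}: expected 'key = value'")
        key, raw = m.group(1), m.group(2).strip()
        # Semantics phase: decide what kind of value the text represents.
        if raw.startswith('"') and raw.endswith('"') and len(raw) >= 2:
            result[key] = raw[1:-1]                # quoted string
        elif raw.isdigit():
            result[key] = int(raw)                 # integer literal
        elif raw in ("true", "false"):
            result[key] = (raw == "true")          # boolean literal
        else:
            raise SyntaxError(f"line {lineno}: bad value {raw!r}")
    return result
```

It's tiny, but it shows the same two steps the episode describes: recognize the syntax, then assign meaning to what you recognized.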
And you know, + +00:24:46.480 --> 00:24:50.540 +that is some folks and that's okay. There's folks who go into programming because they're only + +00:24:50.800 --> 00:24:55.500 +interested in what they can build and they're not so interested in how things work. And if that's + +00:24:55.520 --> 00:24:59.860 +you, this is probably not the book for you. But if you're the type of person who has that curiosity + +00:24:59.960 --> 00:25:04.660 +and you really want to understand how everything's actually working under the covers, then this is a + +00:25:04.660 --> 00:25:10.520 +great book. Yeah, absolutely. And it's a simple enough thing that you can grasp the ideas pretty + +00:25:10.700 --> 00:25:15.840 +quickly with this BF language, right? It's not so complicated that you, you know, a day later, + +00:25:15.980 --> 00:25:21.740 +still trying to make the thing parse. Absolutely. I mean, what's crazy is to interpret BF, + +00:25:22.200 --> 00:25:25.980 +you only need about 30 lines of code, 30 lines of Python. + +00:25:26.380 --> 00:25:30.520 +And then you actually have something that can run any program written in this language. + +00:25:31.240 --> 00:25:36.720 +And to be clear, you have like something.bf files that you can put the language into, + +00:25:36.860 --> 00:25:40.200 +and then you say Python, this module, that file, + +00:25:40.280 --> 00:25:43.600 +and it'll run it as if it were an interpreter for that thing, right? + +00:25:43.860 --> 00:25:44.520 +Exactly. Yeah. + +00:25:44.680 --> 00:25:47.740 +You're literally implementing the whole programming language in like 30 lines of Python. + +00:25:48.260 --> 00:25:50.100 +And I think what's great about this too, + +00:25:50.540 --> 00:25:52.960 +is it takes away the feeling that everything is magic. 
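For a rough flavor of what an interpreter at that scale looks like, here's a hedged sketch of a BF interpreter in Python. This is our own illustration of the idea discussed here, not the book's actual code, and the file name in the usage comment is made up:

```python
import sys

def run(code: str, tape_len: int = 30_000) -> None:
    tape = [0] * tape_len              # memory cells, all starting at zero
    ptr = 0                            # the data pointer
    # Pre-match the brackets so loops can jump in one step.
    jumps, stack = {}, []
    for i, ch in enumerate(code):
        if ch == "[":
            stack.append(i)
        elif ch == "]":
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    pc = 0                             # program counter into `code`
    while pc < len(code):
        ch = code[pc]
        if ch == ">":
            ptr += 1                   # move the data pointer right
        elif ch == "<":
            ptr -= 1                   # move the data pointer left
        elif ch == "+":
            tape[ptr] = (tape[ptr] + 1) % 256
        elif ch == "-":
            tape[ptr] = (tape[ptr] - 1) % 256
        elif ch == ".":
            sys.stdout.write(chr(tape[ptr]))
        elif ch == ",":
            data = sys.stdin.read(1)
            tape[ptr] = ord(data) if data else 0
        elif ch == "[" and tape[ptr] == 0:
            pc = jumps[pc]             # cell is zero: skip the loop body
        elif ch == "]" and tape[ptr] != 0:
            pc = jumps[pc]             # cell is nonzero: loop again
        pc += 1                        # any other character is a comment

if __name__ == "__main__" and len(sys.argv) > 1:
    # Usage in the style described above: python bf.py program.bf
    with open(sys.argv[1]) as f:
        run(f.read())
```

Those eight branches are the entire language, which is the point being made: a handful of lines of Python is enough to run any program written in it.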
+
+00:25:54.300 --> 00:25:56.280
+That's another thing I love when people read
+
+00:25:56.320 --> 00:25:58.500
+also Classic Computer Science Problems in Python
+
+00:25:58.660 --> 00:26:01.620
+is sometimes when you think about how these things work,
+
+00:26:01.800 --> 00:26:04.200
+like in that book, we cover the A-star algorithm,
+
+00:26:04.260 --> 00:26:06.520
+which is something that Google Maps uses.
+
+00:26:06.960 --> 00:26:08.080
+When you think about Google Maps,
+
+00:26:08.740 --> 00:26:10.360
+it feels like magic when you use it.
+
+00:26:10.920 --> 00:26:12.980
+But actually there's really understandable,
+
+00:26:13.200 --> 00:26:15.580
+logical algorithms that are underneath the surface.
+
+00:26:16.060 --> 00:26:16.780
+It's the same thing here.
+
+00:26:17.000 --> 00:26:19.260
+Python itself might feel like magic to a lot of folks.
+
+00:26:19.600 --> 00:26:26.540
+But by the time you get through these first couple chapters, especially through the BASIC interpreter chapter, you'll start to be on the road to think, oh, you know what?
+
+00:26:26.570 --> 00:26:32.280
+I bet I could dive into the CPython source code with enough additional training and really understand it.
+
+00:26:32.290 --> 00:26:35.280
+It gives you that confidence that this is not just magic.
+
+00:26:35.480 --> 00:26:39.480
+You just got to look at the bytecodes and look at it go.
+
+00:26:40.100 --> 00:26:40.200
+Yeah.
+
+00:26:40.520 --> 00:26:44.540
+Not to say there's not a lot more there, but it just gets you on that journey and makes you see it's not magic.
+
+00:26:44.840 --> 00:26:45.060
+Right.
+
+00:26:45.280 --> 00:26:49.080
+Well, it's that zero to one sort of gap that's the hardest to cross.
+
+00:26:49.480 --> 00:26:49.580
+Yeah.
+
+00:26:49.960 --> 00:26:50.140
+Yeah.
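The A* algorithm mentioned here can be sketched quite compactly on a small grid. This is our own minimal illustration with a Manhattan-distance heuristic, not the code from Classic Computer Science Problems in Python:

```python
import heapq

def a_star(grid, start, goal):
    """grid: list of strings, '#' = wall. Returns a path of (row, col) or None."""
    def h(p):
        # Admissible heuristic: straight-line (Manhattan) distance to the goal.
        return abs(p[0] - goal[0]) + abs(p[1] - goal[1])
    frontier = [(h(start), start)]        # priority queue ordered by f = g + h
    came_from = {start: None}             # for reconstructing the path
    g = {start: 0}                        # cheapest known cost to each cell
    while frontier:
        _, cur = heapq.heappop(frontier)
        if cur == goal:                   # walk back through came_from
            path = []
            while cur is not None:
                path.append(cur)
                cur = came_from[cur]
            return path[::-1]
        r, c = cur
        for nr, nc in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            in_bounds = 0 <= nr < len(grid) and 0 <= nc < len(grid[0])
            if in_bounds and grid[nr][nc] != "#":
                new_g = g[cur] + 1
                if new_g < g.get((nr, nc), float("inf")):
                    g[(nr, nc)] = new_g
                    came_from[(nr, nc)] = cur
                    heapq.heappush(frontier, (new_g + h((nr, nc)), (nr, nc)))
    return None                           # no route exists
```

That's the whole trick behind route-finding feeling like magic: a priority queue plus a heuristic that steers the search toward the goal.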
+
+00:26:50.420 --> 00:26:54.980
+You know, like the second language, second programming language you learn or the third
+
+00:26:55.080 --> 00:26:58.080
+or the fourth, they only get easier to learn, not harder to learn.
+
+00:26:58.620 --> 00:27:01.640
+Whereas, you know, maybe when you're first starting out and you're trying to get something
+
+00:27:01.660 --> 00:27:04.860
+to compile and it won't even run, you're like, oh my God, how am I going to do this?
+
+00:27:04.860 --> 00:27:05.820
+I can't even learn this one.
+
+00:27:06.000 --> 00:27:07.360
+There's all these things I'm going to have to know.
+
+00:27:07.540 --> 00:27:09.200
+And it's really kind of upside down.
+
+00:27:09.500 --> 00:27:09.980
+Yeah, absolutely.
+
+00:27:10.680 --> 00:27:15.320
+And so, you know, when we think about understanding how Python itself works, I think the second
+
+00:27:15.560 --> 00:27:19.360
+chapter of the book about BASIC gets us, you know, a lot further along on that journey.
+
+00:27:19.420 --> 00:27:21.580
+Because here this isn't just a made-up thing.
+
+00:27:22.160 --> 00:27:24.400
+BF is a language that's been around
+
+00:27:24.440 --> 00:27:26.300
+in computer science education for like 30 years,
+
+00:27:26.720 --> 00:27:28.980
+but it's not a real language that people actually used.
+
+00:27:29.300 --> 00:27:30.100
+And the second chapter-
+
+00:27:30.100 --> 00:27:33.280
+Very few people ever said that they were BF programmers
+
+00:27:33.280 --> 00:27:34.840
+at a dinner party, right?
+
+00:27:35.440 --> 00:27:35.880
+Absolutely.
+
+00:27:35.880 --> 00:27:38.540
+But plenty of people said they were BASIC programmers.
+
+00:27:38.540 --> 00:27:40.140
+And it was the first programming language
+
+00:27:40.140 --> 00:27:42.160
+of a lot of people who grew up in the 70s, 80s,
+
+00:27:42.160 --> 00:27:42.800
+or even the 90s.
+
+00:27:42.800 --> 00:27:44.660
+It was my first programming language.
+
+00:27:44.660 --> 00:27:47.680
+And there was a dialect of BASIC called Tiny BASIC
+
+00:27:47.680 --> 00:27:49.120
+that came out in the late 1970s.
+
+00:27:49.160 --> 00:27:54.860
+It was actually one of the first free software projects. And it would run on machines that just
+
+00:27:55.040 --> 00:28:01.260
+had two or four kilobytes of RAM. So that's why it was called Tiny BASIC. I mean, it truly was
+
+00:28:01.400 --> 00:28:06.700
+tiny. And so we re-implement a dialect of Tiny BASIC in chapter two. So this is re-implementing
+
+00:28:06.700 --> 00:28:12.440
+a real programming language that people actually used for real work in the late 1970s and up to the
+
+00:28:12.520 --> 00:28:17.560
+early 1980s. And it can run real programs from that period. So you could go download a program
+
+00:28:17.560 --> 00:28:22.000
+from the late 70s. We're missing like one or two features from the real language,
+
+00:28:22.540 --> 00:28:25.560
+but not all programs use all those features, so you could actually run
+
+00:28:25.660 --> 00:28:26.340
+it in this interpreter.
+
+00:28:26.580 --> 00:28:30.940
+So we go from the first language where it's really esoteric, educational, you know, weird
+
+00:28:31.140 --> 00:28:32.520
+language to the second chapter.
+
+00:28:32.780 --> 00:28:33.560
+This was a real thing.
+
+00:28:33.820 --> 00:28:34.620
+Yeah, absolutely.
+
+00:28:34.660 --> 00:28:37.620
+It was, you know, I'm just going to give a little shout out to Visual Basic.
+
+00:28:38.080 --> 00:28:38.180
+Yeah.
+
+00:28:38.600 --> 00:28:39.020
+I don't know.
+
+00:28:39.260 --> 00:28:40.180
+Did you ever do Visual Basic?
+
+00:28:40.560 --> 00:28:40.920
+I did.
+
+00:28:41.020 --> 00:28:43.340
+And I did a version on the Mac called REALbasic.
+
+00:28:43.900 --> 00:28:45.460
+So this is really esoteric.
+
+00:28:45.540 --> 00:28:48.520
+But anyone who used REALbasic in the late 90s, I'm in your camp.
+
+00:28:48.720 --> 00:28:49.160
+There you go.
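For a taste of what interpreting a line-numbered BASIC dialect involves, here's a hedged sketch covering just LET, PRINT, and GOTO. This tiny subset is our own illustration and is far smaller than the Tiny BASIC dialect the chapter actually implements:

```python
def run_basic(source: str):
    """Run a toy line-numbered BASIC subset; returns printed output as a list."""
    # Parse "10 LET X = 5" style lines into a table keyed by line number.
    program = {}
    for line in source.strip().splitlines():
        num, stmt = line.strip().split(" ", 1)
        program[int(num)] = stmt.strip()
    order = sorted(program)               # execution order of line numbers
    variables, output = {}, []
    i = 0
    while i < len(order):
        stmt = program[order[i]]
        if stmt.startswith("LET "):
            name, expr = stmt[4:].split("=", 1)
            # eval over our variables only -- fine for a toy, never for real input
            variables[name.strip()] = eval(expr, {}, variables)
        elif stmt.startswith("PRINT "):
            output.append(str(eval(stmt[6:], {}, variables)))
        elif stmt.startswith("GOTO "):
            i = order.index(int(stmt[5:]))
            continue                      # jump without falling through
        i += 1
    return output
```

Even at this scale you can see why GOTO makes the interpreter different from the BF one: control flow jumps between numbered lines rather than between matched brackets.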
+ +00:28:50.100 --> 00:28:56.920 +I don't know if we even today still have something as approachable and productive as Visual Basic was. + +00:28:57.340 --> 00:28:59.360 +For people who haven't used it, you're like, there's just no way. + +00:28:59.560 --> 00:29:00.600 +There's no way that it's basic. + +00:29:01.180 --> 00:29:02.700 +But you would just get a visual thing. + +00:29:02.700 --> 00:29:07.560 +You would drag it together to build your, you know, here, I want to have, literally, you would drag over. + +00:29:07.740 --> 00:29:08.600 +Here's a web browser. + +00:29:08.920 --> 00:29:09.780 +Here's the address bar. + +00:29:09.940 --> 00:29:10.620 +Here's a go button. + +00:29:11.040 --> 00:29:14.760 +And then you would double click the go button and would create an event handler. + +00:29:14.820 --> 00:29:19.320 +and then you would go webbrowser.gototext.value or whatever. + +00:29:19.520 --> 00:29:21.240 +I mean, that was literally, you could do it in five minutes. + +00:29:21.340 --> 00:29:22.560 +You could like create something + +00:29:22.700 --> 00:29:24.940 +that is a functional web browser without much. + +00:29:25.480 --> 00:29:26.280 +It was incredible. + +00:29:26.560 --> 00:29:28.340 +And so, yeah, I'm just thinking back + +00:29:28.500 --> 00:29:29.620 +to a few things I've built with it. + +00:29:29.620 --> 00:29:30.060 +It was amazing. + +00:29:30.380 --> 00:29:32.000 +You know, Michael, a lot of people agree with you. + +00:29:32.120 --> 00:29:35.060 +There's a lot of articles that I've seen on blogs and stuff + +00:29:35.140 --> 00:29:36.880 +where people reminisce about Visual Basic. + +00:29:37.460 --> 00:29:38.360 +And I agree. + +00:29:38.520 --> 00:29:40.200 +I mean, for desktop app development, + +00:29:40.660 --> 00:29:42.340 +it was incredibly productive. + +00:29:42.380 --> 00:29:47.820 +I mean, I think it still rivals some of the tools we have for building web apps today. 
+
+00:29:47.980 --> 00:29:50.400
+When you think about how easy it was to lay out a user interface.
+
+00:29:51.100 --> 00:29:55.260
+And for designers, it was great too, because designers didn't need to know how to code.
+
+00:29:55.600 --> 00:29:59.140
+And they could lay out the interface in the same way that it would really appear in the program,
+
+00:29:59.640 --> 00:30:02.940
+which is different from how designers work today, where they'll often do mock-ups.
+
+00:30:03.220 --> 00:30:06.040
+And the developer will have to take the mock-up and turn it into code.
+
+00:30:06.660 --> 00:30:12.360
+And so you kind of lose something there with the designer being able to have the final product
+
+00:30:12.380 --> 00:30:14.660
+in front of them as they're changing around how things look.
+
+00:30:15.520 --> 00:30:17.160
+So there are elements of it
+
+00:30:17.260 --> 00:30:18.620
+that are still missing today, I think.
+
+00:30:18.820 --> 00:30:20.940
+- Yeah, I think it really hasn't been matched.
+
+00:30:21.100 --> 00:30:24.120
+Windows Forms from .NET kind of approached that,
+
+00:30:24.300 --> 00:30:25.700
+but it itself was also
+
+00:30:25.730 --> 00:30:26.900
+a little bit more complicated.
+
+00:30:27.440 --> 00:30:29.000
+There was something special about that.
+
+00:30:29.420 --> 00:30:31.020
+And now, don't get me wrong,
+
+00:30:31.200 --> 00:30:33.000
+it's not like I'm saying we should just go back to it
+
+00:30:33.000 --> 00:30:35.200
+because the software we build is way more advanced,
+
+00:30:35.580 --> 00:30:36.380
+does a lot of other things,
+
+00:30:36.570 --> 00:30:39.800
+but that lower-level area is just kind of missing.
+
+00:30:41.720 --> 00:30:44.240
+This portion of Talk Python To Me is brought to you by NordStellar.
+
+00:30:44.880 --> 00:30:48.600
+NordStellar is a threat exposure management platform from the Nord Security family,
+
+00:30:48.820 --> 00:30:52.460
+the folks behind NordVPN that combines dark web intelligence,
+
+00:30:53.060 --> 00:30:55.900
+session hijacking prevention, brand abuse detection,
+
+00:30:56.260 --> 00:30:58.100
+and external attack surface management.
+
+00:30:58.580 --> 00:31:01.920
+Keeping your team and your company secure is a daunting challenge.
+
+00:31:02.580 --> 00:31:04.600
+That's why you need NordStellar on your side.
+
+00:31:04.880 --> 00:31:08.120
+It's a comprehensive set of services, monitoring, and alerts
+
+00:31:08.140 --> 00:31:11.000
+to limit your exposure to breaches and attacks
+
+00:31:11.660 --> 00:31:13.920
+and act instantly if something does happen.
+
+00:31:14.420 --> 00:31:14.980
+Here's how it works.
+
+00:31:15.480 --> 00:31:17.400
+NordStellar detects compromised employee
+
+00:31:17.800 --> 00:31:18.780
+and consumer credentials.
+
+00:31:19.440 --> 00:31:21.920
+It detects stolen authentication cookies
+
+00:31:22.200 --> 00:31:25.060
+found in InfoStealer logs and dark web sources
+
+00:31:25.520 --> 00:31:27.040
+and flags compromised devices,
+
+00:31:27.660 --> 00:31:31.300
+reducing MFA bypass ATOs without extra code in your app.
+
+00:31:31.700 --> 00:31:33.300
+NordStellar scans the dark web
+
+00:31:33.480 --> 00:31:35.340
+for cyber threats targeting your company.
+
+00:31:36.080 --> 00:31:41.820
+It monitors forums, markets, ransomware blogs, and over 25,000 cybercrime Telegram channels
+
+00:31:42.170 --> 00:31:46.420
+with alerting and searchable contexts you can route to Slack or your IR tool.
+
+00:31:47.100 --> 00:31:49.440
+NordStellar adds brand and domain protection.
+
+00:31:50.060 --> 00:31:56.080
+It detects cybersquats and lookalikes via visual similarity, content similarity, and certificate transparency logs,
+
+00:31:56.320 --> 00:32:03.180
+plus broader brand abuse takedowns across the web, social, and app stores to cut the phishing risk for your users.
+
+00:32:03.680 --> 00:32:05.540
+They don't just alert you about impersonation.
+
+00:32:05.760 --> 00:32:07.440
+They file and manage the removals.
+
+00:32:08.000 --> 00:32:10.120
+Finally, NordStellar is developer-friendly.
+
+00:32:10.720 --> 00:32:13.320
+It's available as a platform and an API.
+
+00:32:14.000 --> 00:32:14.960
+No agents to install.
+
+00:32:15.420 --> 00:32:17.360
+If security is important to you and your organization,
+
+00:32:17.720 --> 00:32:18.540
+check out NordStellar.
+
+00:32:19.020 --> 00:32:21.260
+Visit talkpython.fm/nordstellar.
+
+00:32:21.400 --> 00:32:23.120
+The link is in your podcast player's show notes
+
+00:32:23.400 --> 00:32:24.300
+and on the episode page.
+
+00:32:24.760 --> 00:32:27.440
+Please use our link, talkpython.fm/nordstellar,
+
+00:32:27.880 --> 00:32:30.940
+so that they know that you heard about their service from us.
+
+00:32:31.300 --> 00:32:32.420
+And you know what time of year it is.
+
+00:32:32.680 --> 00:32:33.740
+It's late fall.
+
+00:32:34.160 --> 00:32:36.420
+That means Black Friday is in play as well.
+
+00:32:36.500 --> 00:32:40.480
+So the folks at NordStellar gave us a coupon, BlackFriday20.
+
+00:32:41.000 --> 00:32:44.080
+That's Black Friday, all one word, all caps, 20, two, zero.
+
+00:32:44.580 --> 00:32:45.960
+That grants you 20% off.
+
+00:32:45.980 --> 00:32:47.740
+So if you're going to sign up for them soon,
+
+00:32:48.320 --> 00:32:50.520
+go ahead and use BlackFriday20 as a code,
+
+00:32:50.720 --> 00:32:52.060
+and you might as well save 20%.
+
+00:32:52.180 --> 00:32:54.680
+It's good until December 10th, 2025.
+ +00:32:55.620 --> 00:32:57.580 +Thank you to the whole Nord security team + +00:32:57.820 --> 00:32:58.960 +for supporting Talk Python To Me. + +00:33:00.200 --> 00:33:01.020 +And I want to talk, + +00:33:01.220 --> 00:33:03.740 +sort of transition from that to something else really. + +00:33:03.820 --> 00:33:08.100 +We're looking at these two examples of the BF language interpreter and the basic interpreter. + +00:33:08.480 --> 00:33:14.560 +I hear that to really understand computer science, really to work on these things, I have to. + +00:33:14.710 --> 00:33:15.780 +I don't, it's not preferable. + +00:33:15.850 --> 00:33:19.580 +I have to do a language with pointers, malloc, free. + +00:33:20.020 --> 00:33:20.740 +I've got to. + +00:33:21.080 --> 00:33:21.840 +I've got to work at that level. + +00:33:22.140 --> 00:33:23.100 +I just won't understand anything. + +00:33:23.820 --> 00:33:26.020 +And Python, we don't really have those concepts. + +00:33:26.360 --> 00:33:31.100 +And the irony, I think, is Python has more pointers than C++ because there's like no stack at all. + +00:33:31.270 --> 00:33:31.960 +Not at all. + +00:33:32.520 --> 00:33:32.880 +Really? + +00:33:32.900 --> 00:33:37.460 +I mean, in the interpreter there is, but not in your writing. Even a number one is just a pointer. + +00:33:37.880 --> 00:33:41.280 +So Python's full of pointers, but not in the way that computer science thinks about it. + +00:33:41.660 --> 00:33:45.880 +What are your thoughts about that sort of tension? On one hand, you have this really + +00:33:46.060 --> 00:33:52.560 +understandable language talking about these ideas, but the computer is calling malloc on a page of + +00:33:52.680 --> 00:33:57.180 +memory and that's what's happening and they're not seeing it. Okay. So let me talk about it from a + +00:33:57.200 --> 00:34:02.660 +pedagogy standpoint. So at my last institution, Champlain College, we had a big debate over my + +00:34:02.840 --> 00:34:08.679 +nine years there. 
Should we do our first three classes in Python or in C++? And when I came in,
+
+00:34:08.800 --> 00:34:12.700
+the first three classes were in C++. And we actually decided over the years to keep it there
+
+00:34:13.040 --> 00:34:17.020
+for exactly what you mentioned. We wanted to give students both that high level experience with
+
+00:34:17.120 --> 00:34:21.280
+object-oriented programming, but we also wanted them to have experience with pointers, with memory
+
+00:34:21.540 --> 00:34:26.340
+management, and understand how things work at a low level. But in that same period of time, many
+
+00:34:26.280 --> 00:34:30.179
+schools have moved to Python because of the other thing we talked about, which is accessibility.
+
+00:34:31.200 --> 00:34:37.280
+I think Python simply is an easier language to learn than C++. I don't think that most people
+
+00:34:37.320 --> 00:34:42.580
+who know both languages would really debate that. And so if we're trying to make those first ramp
+
+00:34:42.700 --> 00:34:48.360
+up classes where you're first learning CS as easy as possible, I think Python is the way to go if
+
+00:34:48.360 --> 00:34:51.659
+we want to encourage more people into the discipline. That doesn't mean there shouldn't
+
+00:34:51.679 --> 00:35:02.360
+be a C or C++ course later on, where folks, maybe when they take operating systems or even as part of an architecture class, see how the assembly language matches to a C program, that kind of thing.
+
+00:35:03.320 --> 00:35:06.660
+You know, it doesn't mean there shouldn't be C or C++ in the curriculum or Rust or whatever.
+
+00:35:07.660 --> 00:35:13.920
+But if we're thinking about what's best for a student who's just coming into the field, I think we need to think about accessibility.
+
+00:35:15.020 --> 00:35:20.940
+But at the same time, my advice to all students is learn one language well before you learn any other language.
+
+00:35:21.420 --> 00:35:31.540
+So whether you're starting with Python or you're starting with C++, spend a year or two on it and become really decent at it before you go and learn another language.
+
+00:35:31.980 --> 00:35:34.740
+And that's the biggest mistake I see folks who are self-taught make:
+
+00:35:35.160 --> 00:35:39.500
+constantly switching around from language to language.
+
+00:35:39.960 --> 00:35:40.880
+I need to know this.
+
+00:35:41.100 --> 00:35:41.800
+Okay, I got it.
+
+00:35:41.900 --> 00:35:43.020
+Now it's time to learn this.
+
+00:35:43.120 --> 00:35:45.440
+And they're trying to fill all these gaps, right?
+
+00:35:45.940 --> 00:35:46.340
+Right, right.
+
+00:35:46.480 --> 00:35:51.040
+Because once you learn Python well, a lot of the stuff in C or C++ will make sense to you.
+
+00:35:51.040 --> 00:35:55.340
+The pointers might not, but a lot of the other stuff, like how does a function work?
+
+00:35:55.380 --> 00:35:56.120
+How does a loop work?
+
+00:35:56.560 --> 00:35:57.300
+What are variables?
+
+00:35:57.980 --> 00:35:58.900
+What's a global versus local?
+
+00:35:58.960 --> 00:35:59.620
+All that kind of stuff.
+
+00:36:00.220 --> 00:36:01.340
+That's going to make sense to you.
+
+00:36:01.340 --> 00:36:05.660
+And those are going to be transferable skills once you've really learned it in any one language
+
+00:36:05.880 --> 00:36:06.700
+across any other language.
+
+00:36:06.820 --> 00:36:08.460
+And then pointers, you can learn later on.
+
+00:36:08.460 --> 00:36:11.440
+You don't need to learn that at the beginning of your CS education.
+
+00:36:12.960 --> 00:36:15.200
+People make it sound like it's a totally mystical topic.
+
+00:36:16.020 --> 00:36:22.660
+If you are able to do calculus, which basically every CS degree requires calculus one, you can learn pointers.
+
+00:36:23.440 --> 00:36:24.120
+You'll be okay.
+ +00:36:24.240 --> 00:36:25.180 +You can learn pointers. + +00:36:26.040 --> 00:36:26.740 +Yeah, exactly. + +00:36:27.000 --> 00:36:29.900 +Like double integrals and all that kind of stuff is way worse. + +00:36:30.280 --> 00:36:32.120 +I feel like you could also cycle, right? + +00:36:32.360 --> 00:36:38.840 +So you could start with Python for a couple of classes, then go deeper, closer to the machine with C. + +00:36:38.980 --> 00:36:45.800 +But then you could come back and say, let's look at Python again with new eyes and try to understand interpreted dynamic languages better. + +00:36:45.860 --> 00:36:48.720 +because now you can take the red pill + +00:36:48.920 --> 00:36:51.360 +and you can see the arenas + +00:36:51.500 --> 00:36:53.620 +and the blocks of the memory allocator + +00:36:53.880 --> 00:36:56.540 +and the GC and all that kind of stuff, right? + +00:36:56.640 --> 00:36:57.780 +You could go, actually, + +00:36:58.280 --> 00:36:59.240 +you didn't know any of this stuff. + +00:36:59.240 --> 00:37:00.980 +You didn't, nobody probably even thought + +00:37:01.060 --> 00:37:02.220 +to think is this here + +00:37:02.700 --> 00:37:03.800 +and yet look at what's underneath + +00:37:04.120 --> 00:37:05.100 +that you're taking, + +00:37:05.240 --> 00:37:06.740 +you're building on top of, right? + +00:37:07.020 --> 00:37:08.880 +Yeah, and also remember how long + +00:37:08.920 --> 00:37:10.180 +an academic semester is. + +00:37:10.220 --> 00:37:11.060 +It's 15 weeks. + +00:37:11.480 --> 00:37:12.860 +So if you're taking a class in college, + +00:37:13.040 --> 00:37:15.240 +you're forced to be doing that same language + +00:37:15.260 --> 00:37:19.440 +for 15 weeks. That's really the challenge for self-taught folks is they could just spend two + +00:37:19.440 --> 00:37:22.820 +days on one, two days on another, right? But absolutely, you're right. Once you've had enough + +00:37:23.000 --> 00:37:27.720 +time in one, you can cycle to the others back and forth. 
Yeah, fun. I personally am a fan of + +00:37:27.990 --> 00:37:33.680 +it being in Python because I feel like one of the biggest challenges to keeping people in computer + +00:37:33.860 --> 00:37:40.700 +science and programming is that they don't get enough early wins and early like, yes, aha, right? + +00:37:40.860 --> 00:37:45.980 +It's like, okay, in week 12, we'll let you write a program, but we're going to talk pointers for a while. + +00:37:46.360 --> 00:37:48.300 +And then you'll finally get to make some, you know what I mean? + +00:37:48.380 --> 00:37:53.540 +Like it's just so delayed and that's fine for certain people, but there's a lot of people are like, oh, forget this. + +00:37:54.140 --> 00:37:55.260 +This is not what I thought. + +00:37:55.260 --> 00:37:55.660 +I'm out. + +00:37:55.840 --> 00:37:59.300 +And a huge part of that is the Python library ecosystem. + +00:37:59.760 --> 00:38:05.960 +How easy it is to drop Pygame into an intro class and get somebody building a really simple game. + +00:38:06.880 --> 00:38:14.400 +You know, it's much harder in C or C++ to start integrating libraries and start having students understand how to use those libraries. + +00:38:14.540 --> 00:38:20.200 +And it usually requires a lot more knowledge buildup to be able to use pointers and stuff with those libraries. + +00:38:20.650 --> 00:38:22.440 +So Python just makes everything more accessible. + +00:38:22.880 --> 00:38:23.080 +Awesome. + +00:38:23.480 --> 00:38:23.540 +Cool. + +00:38:23.800 --> 00:38:24.060 +All right. + +00:38:24.400 --> 00:38:30.720 +The next area that was pretty interesting and a little bit, maybe a little artsy. + +00:38:31.100 --> 00:38:31.280 +Yeah. + +00:38:31.500 --> 00:38:33.000 +Is the computational art. + +00:38:33.180 --> 00:38:33.820 +Tell us about that. + +00:38:33.860 --> 00:38:39.480 +Yeah. So there's two chapters in the book on computational art. 
The first one is really + +00:38:39.720 --> 00:38:44.620 +starting to understand what a pixel is on the screen. And the way we do that is we take modern + +00:38:44.890 --> 00:38:50.760 +photos and then we want to display them on like an ancient Mac, a Mac from the 1980s, + +00:38:51.080 --> 00:38:56.500 +like the Mac Plus. And so we're going to take that modern photo and use what's called a dithering + +00:38:56.680 --> 00:39:02.520 +algorithm to break it down into patterns of black and white pixels because those early Macs were + +00:39:02.400 --> 00:39:06.720 +only black and white, that will then still kind of look like the photo in black and white with + +00:39:06.740 --> 00:39:10.920 +these specialized patterns. So you're learning a bunch of things by doing this. One thing you're + +00:39:11.020 --> 00:39:15.060 +learning is you're learning really what is a pixel. And a pixel is really pretty simple at its base. + +00:39:15.180 --> 00:39:19.980 +I mean, it's just a color and a location. And so understanding how those pixels are organized + +00:39:20.800 --> 00:39:26.140 +and really they're just organized usually as an array or, you know, as a list in Python or a + +00:39:26.200 --> 00:39:32.360 +numpy array or whatever. And then we're understanding some algorithmic stuff. So + +00:39:32.380 --> 00:39:34.540 +is giving us some algorithmic practice. + +00:39:35.120 --> 00:39:36.320 +And then a cool thing we do at the end + +00:39:36.320 --> 00:39:38.360 +is we actually convert it into a file format + +00:39:38.780 --> 00:39:42.620 +called Mac Paint for displaying on an ancient Mac. + +00:39:42.940 --> 00:39:44.940 +And that Mac Paint format uses + +00:39:45.420 --> 00:39:47.600 +a very simple compression scheme + +00:39:47.880 --> 00:39:48.900 +called run length encoding. 
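The dithering step just described can be sketched compactly. This is my own minimal version of classic error-diffusion (Floyd-Steinberg) dithering on a plain 2D list of grayscale values, not the book's code: each pixel snaps to black or white, and the rounding error is pushed onto its unvisited neighbors so the dot patterns preserve the overall tone.

```python
# Minimal Floyd-Steinberg error-diffusion dithering (illustrative sketch).

def dither(gray):
    """gray: 2D list of 0..255 values; returns a 2D list of 0/255."""
    h, w = len(gray), len(gray[0])
    px = [row[:] for row in gray]            # working copy (floats are fine)
    for y in range(h):
        for x in range(w):
            old = px[y][x]
            new = 255 if old >= 128 else 0   # snap to black or white
            px[y][x] = new
            err = old - new
            # Spread the rounding error to unvisited neighbors (FS weights).
            for dx, dy, weight in ((1, 0, 7/16), (-1, 1, 3/16),
                                   (0, 1, 5/16), (1, 1, 1/16)):
                nx, ny = x + dx, y + dy
                if 0 <= nx < w and 0 <= ny < h:
                    px[ny][nx] += err * weight
    return px

# A uniform mid-gray patch dithers to an alternating pattern.
result = dither([[128, 128], [128, 128]])
```

For a real photo you would run the same loop over the pixel data from Pillow or a NumPy array; the algorithm itself does not change.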
+ +00:39:49.320 --> 00:39:51.440 +So we're getting some other kind of algorithmic practice + +00:39:51.590 --> 00:39:52.320 +in there as well, + +00:39:52.900 --> 00:39:54.620 +understanding a simple compression algorithm. + +00:39:55.020 --> 00:39:57.260 +And we're understanding something about file formats too, + +00:39:57.560 --> 00:40:00.560 +which is really kind of an interesting CS topic as well. + +00:40:00.570 --> 00:40:01.380 +Yeah, it sure is, yeah. + +00:40:01.400 --> 00:40:06.200 +to properly format the file so the Mac Paint program on the ancient Mac will open it correctly. + +00:40:06.560 --> 00:40:11.040 +So yeah, that was something actually when I first got into programming, I feel like I just stuck + +00:40:11.100 --> 00:40:16.920 +with text oriented files pretty much as long as I could. Because, you know, looking at a binary file, + +00:40:17.060 --> 00:40:22.540 +like, okay, it has a header and we read the first few bytes and then the value of that byte tells + +00:40:22.560 --> 00:40:26.880 +you how long the rest of the header is and then like what this means. And then you skip. And I + +00:40:26.900 --> 00:40:31.640 +don't know. That was just, it was a bridge too far for me in my early programming days. And I was + +00:40:31.800 --> 00:40:35.640 +like, wow, this is intense. And I was in awe of people like, yeah, we just read the header and we + +00:40:35.680 --> 00:40:42.620 +do this. I'm like, okay, if you say so. You know, it's a classic CS trade-off between time and space. + +00:40:43.120 --> 00:40:48.260 +Anything that we do in a binary file, we could do in a text file. But binary files can be more + +00:40:48.400 --> 00:40:53.960 +efficient because they can be more compact and they can be faster to read from for certain kinds + +00:40:53.980 --> 00:41:00.200 +of data. 
So, you know, it's not that we have to use binary files, but understanding what a binary + +00:41:00.360 --> 00:41:05.600 +file format is like can be eye-opening to some readers. And if you think about like modern file + +00:41:05.800 --> 00:41:09.600 +formats, of course, text formats are much more explainable, right? That's why we have the rise + +00:41:09.600 --> 00:41:15.240 +of things like XML in the late nineties up to today, or even JSON as a data interchange format. + +00:41:15.500 --> 00:41:20.160 +There's alternatives to JSON, of course, that are much more efficient for coding your web app in + +00:41:20.180 --> 00:41:25.800 +for distributing the data back and forth. But JSON is human readable. Yeah, yeah, yeah. But JSON is + +00:41:25.920 --> 00:41:31.360 +human readable. So we can right away understand what it's supposed to represent and debug it really + +00:41:31.680 --> 00:41:34.780 +well. And so it's, you know, it's one of those classic trade-offs. We can have something more + +00:41:34.920 --> 00:41:40.460 +efficient or we can have something that might take up a little bit more time, but actually, + +00:41:40.880 --> 00:41:44.980 +you know, is better for us as human beings. Yeah. It's just, I think it's a good skill. + +00:41:45.060 --> 00:41:53.820 +And also something that was prevalent throughout the book is juggling bits, bits and bytes, not just juggling bytes, but bits and bit flipping and shifting. + +00:41:54.110 --> 00:41:59.180 +And there's a lot of that going on, especially in the emulator layers and stuff like that. + +00:41:59.480 --> 00:42:04.540 +Yeah. And even in the Mac paint chapter, because the way Mac paint stores pixels is as individual bits. + +00:42:05.040 --> 00:42:09.080 +So you have like a one or a zero representing a black or a white pixel on the screen. 
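Packing eight of those 1-bit pixels into each byte is a matter of shifts and ORs. A small sketch of my own (MSB-first by convention here; not necessarily MacPaint's exact layout):

```python
# Pack 1-bit pixels (0/1) into bytes, eight per byte, MSB = leftmost pixel.
# (Illustrative sketch of the bit-juggling, not a MacPaint encoder.)

def pack_bits(pixels):
    """pixels: list of 0/1 values, length a multiple of 8."""
    out = bytearray()
    for i in range(0, len(pixels), 8):
        byte = 0
        for bit in pixels[i:i + 8]:
            byte = (byte << 1) | bit          # shift left, OR in the next bit
        out.append(byte)
    return bytes(out)

def unpack_bits(data):
    # Reverse: read each byte's bits from the high bit down.
    return [(byte >> shift) & 1 for byte in data for shift in range(7, -1, -1)]

row = [1, 0, 1, 0, 1, 0, 1, 0]
assert pack_bits(row) == b"\xaa"              # 0b10101010
assert unpack_bits(pack_bits(row)) == row
```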
+ +00:42:09.540 --> 00:42:22.600 +And so you have to find a way to take those bits and compact them into bytes and then run the right run length encoding algorithm compress it. It's yeah, but there's a lot of that. And, you know, when you do really low level work in computing, you need to understand bits and bytes. + +00:42:23.060 --> 00:42:31.300 +If you're going to work on device drivers or operating systems or file formats, you really need to understand this stuff at an intimate level. + +00:42:31.520 --> 00:42:42.420 +So we try to make fun projects in the book that you get something that's interesting as a way of making this a little of sugar to help the medicine go down kind of thing. + +00:42:43.480 --> 00:42:45.620 +And we were talking about the computer graphics chapters. + +00:42:45.800 --> 00:42:50.240 +I also wanted to mention chapter four because I love the program in that called Impressionist. + +00:42:50.300 --> 00:42:56.320 +It makes images that look like an impressionist painter painted a photograph. + +00:42:56.850 --> 00:43:01.260 +So you give it a photograph and then it builds kind of abstract art out of that photograph. + +00:43:01.470 --> 00:43:06.820 +And people usually think you need a neural network for that or you need some kind of really advanced machine learning algorithm for that. + +00:43:07.440 --> 00:43:11.560 +But actually, we show in the chapter that you can do it using a pretty simple algorithm. + +00:43:11.580 --> 00:43:23.160 +All the algorithm does is it puts a vector shape on the screen and it tries to position the vector shape so that it overlaps a region of color on the original photo that is close to the color in the vector shape. + +00:43:23.460 --> 00:43:30.080 +And if you keep doing that and you put enough vector shapes on the screen, you start to have like abstract shapes that look like the original photo. 
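The Impressionist idea can be sketched greedily in a few lines. This toy grayscale version is my own, not the book's program (which uses vector shapes and color via Pillow): drop random rectangles whose shade is sampled from the target photo, and keep each one only if it brings the canvas closer to the target.

```python
import random

# Toy "impressionist" painter: random rectangles, kept only if they reduce
# the squared error against the target image. (Illustrative sketch.)

def error(a, b):
    return sum((pa - pb) ** 2 for ra, rb in zip(a, b) for pa, pb in zip(ra, rb))

def paint(target, steps=500, seed=0):
    rng = random.Random(seed)
    h, w = len(target), len(target[0])
    canvas = [[128] * w for _ in range(h)]     # start from flat gray
    best = error(canvas, target)
    for _ in range(steps):
        x, y = rng.randrange(w), rng.randrange(h)
        rw, rh = rng.randint(1, w), rng.randint(1, h)
        shade = target[y][x]                   # sample the color from the photo
        trial = [row[:] for row in canvas]
        for yy in range(y, min(y + rh, h)):
            for xx in range(x, min(x + rw, w)):
                trial[yy][xx] = shade
        e = error(trial, target)
        if e < best:                           # keep only improvements
            canvas, best = trial, e
    return canvas, best

target = [[0, 0, 255, 255]] * 4                # left half black, right half white
canvas, final_err = paint(target)
```

Enough accepted shapes and the canvas converges toward the photo, exactly the "simple technique, powerful output" point from the chapter.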
+ +00:43:30.540 --> 00:43:38.820 +And so I think that's that chapter is kind of powerful because it shows you how a simple technique, a simple computational technique can really have some pretty powerful output. + +00:43:38.940 --> 00:43:42.640 +it. Yeah, it's a really interesting idea and it comes out looking great. It's sort of, + +00:43:42.800 --> 00:43:46.500 +it's approaching the problem from a different perspective, which I suspect is probably a pretty + +00:43:46.660 --> 00:43:52.200 +interesting CS lesson. You know, there's these problems that are incredibly expensive and + +00:43:52.560 --> 00:43:57.920 +difficult to compute the one true answer, but then there's amazingly fast ways to get a, + +00:43:58.240 --> 00:44:02.340 +that's pretty much it, answer. I'm thinking like Monte Carlo simulations and stuff like that. + +00:44:02.540 --> 00:44:08.900 +You're like, this could take two weeks or three milliseconds, which would you prefer? You're + +00:44:08.920 --> 00:44:12.380 +And, you know, that that's we talked about, like, why is the book in Python? + +00:44:12.940 --> 00:44:20.660 +And, you know, that is one of the challenges of writing the book in Python is simply Python doesn't have great performance when you write in pure Python. + +00:44:21.540 --> 00:44:28.140 +And so, you know, we've all seen the benchmarks where Python's like 50 or 70 times slower than C on certain benchmarks. + +00:44:28.480 --> 00:44:38.060 +Right. 
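The Monte Carlo trade-off mentioned above, an approximate answer for a fraction of the cost, in its classic form: estimate π by sampling random points in the unit square and counting how many land inside the quarter circle. (Illustrative sketch, not from the book.)

```python
import random

# Monte Carlo estimate of pi: fraction of random points inside the unit
# quarter circle, times 4. Approximate, but cheap and trivially tunable.

def estimate_pi(samples, seed=0):
    rng = random.Random(seed)
    inside = sum(rng.random() ** 2 + rng.random() ** 2 <= 1.0
                 for _ in range(samples))
    return 4 * inside / samples

# More samples -> a better estimate, at linear cost.
print(estimate_pi(100_000))
```

The exact answer is free here, but for integrals and simulations with no closed form, this "pretty much it" answer in milliseconds is the whole appeal.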
And yeah, honestly, for some of the programs in this book, you really see that performance deficit you have with Python, like the abstract art chapter, + +00:44:38.140 --> 00:44:42.880 +where if I wrote that same program in C using some C graphics library instead of in Python + +00:44:43.300 --> 00:44:47.360 +with Pillow, even though Pillow, I think, is mostly implemented in C anyway, but still just + +00:44:47.450 --> 00:44:53.840 +with the overhead of having our algorithmic part of it in Python, that program is probably 30, + +00:44:54.020 --> 00:44:58.860 +50 times slower than it would be in C. So you have to wait like 20 minutes to see that abstract art + +00:44:59.300 --> 00:45:03.840 +that would have been like less than a minute in a C program. Not that it's relevant to the book or + +00:45:03.860 --> 00:45:11.580 +of the courses, but you probably could bring in some optimizations like Cython or Numba, + +00:45:12.120 --> 00:45:17.800 +a couple of the things that'll take just the inner two loops and just make them go different, + +00:45:18.040 --> 00:45:22.920 +but it's not the point of the book. That's true. And so I put that as an exercise for the reader + +00:45:23.280 --> 00:45:28.160 +in the NES emulator chapter, because the NES emulator chapter where we actually build up + +00:45:28.400 --> 00:45:32.760 +a real NES emulator. So I didn't write in the book for legal reasons, but it can actually play + +00:45:32.800 --> 00:45:36.880 +Donkey Kong. So it can play Donkey Kong from the original Nintendo Entertainment System. + +00:45:36.980 --> 00:45:40.400 +Let's talk about the NES, the Nintendo Entertainment System. + +00:45:40.840 --> 00:45:45.380 +Yeah. So, I mean, the NES, a lot of us remember growing up with, and if you're of a younger + +00:45:45.640 --> 00:45:51.420 +generation, this was like the main video game system of the 1980s to early 1990s that everyone + +00:45:51.580 --> 00:45:56.880 +had. 
It's where the original Super Mario came out, the original Legend of Zelda. And, you know, + +00:45:57.040 --> 00:46:01.400 +it's actually, of course, like all video game systems, it's a computer. And being a video game + +00:46:01.440 --> 00:46:06.680 +system for the 1980s. It's a pretty simple computer, actually. It has a 6502 microprocessor, + +00:46:07.100 --> 00:46:12.320 +which is the same microprocessor that was in the Apple II or the Commodore 64 or the TRS, + +00:46:12.380 --> 00:46:21.120 +I think the TRS-80 also. And that microprocessor, it only has 56 instructions, the NES version of it. + +00:46:21.440 --> 00:46:25.600 +And so you can write an interpreter for that microprocessor pretty compactly, + +00:46:26.020 --> 00:46:31.380 +not quite as compact as the BF interpreter in chapter one, but compactly enough that it's only + +00:46:31.400 --> 00:46:37.820 +lines of Python to be able to really write a simulator of that CPU. And so the NES is just + +00:46:37.980 --> 00:46:43.400 +that CPU plus a graphics processor called the picture processing unit plus an audio processor. + +00:46:43.840 --> 00:46:49.320 +So in the chapter, we implement the full CPU. We implement a simplified version of the PPU, + +00:46:49.720 --> 00:46:53.700 +not advanced enough to run Super Mario Brothers, but it is advanced enough to run Donkey Kong. + +00:46:54.599 --> 00:46:59.240 +And we don't implement the APU. So the audio processing unit is a little more complicated + +00:46:59.260 --> 00:47:00.320 +We don't do the audio. + +00:47:00.860 --> 00:47:02.120 +But we do all of this in the chapter + +00:47:02.200 --> 00:47:03.800 +so you can run real NES games. + +00:47:04.600 --> 00:47:06.440 +And what you're getting is you're getting to understand + +00:47:06.720 --> 00:47:07.900 +that software hardware interface. 
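The fetch-decode-execute loop at the heart of such a CPU interpreter is small. This toy sketch is mine, not the book's emulator: it handles just three instructions, though the opcode values shown do match the real 6502 encodings for LDA immediate, STA zero page, and BRK.

```python
# Toy fetch-decode-execute loop in the spirit of a 6502 core.
# (Sketch only: three opcodes, no flags, no stack, simplified BRK.)

class CPU:
    def __init__(self, memory):
        self.memory = memory      # flat bytearray standing in for the bus
        self.a = 0                # accumulator register
        self.pc = 0               # program counter

    def step(self):
        opcode = self.memory[self.pc]          # fetch
        self.pc += 1
        if opcode == 0xA9:                     # LDA #immediate
            self.a = self.memory[self.pc]
            self.pc += 1
        elif opcode == 0x85:                   # STA zero page
            self.memory[self.memory[self.pc]] = self.a
            self.pc += 1
        elif opcode == 0x00:                   # BRK: stop (simplified)
            return False
        else:
            raise ValueError(f"unhandled opcode {opcode:#x}")
        return True

memory = bytearray(256)
memory[0x10:0x15] = bytes([0xA9, 0x2A, 0x85, 0x80, 0x00])  # LDA #42; STA $80; BRK
cpu = CPU(memory)
cpu.pc = 0x10
while cpu.step():
    pass
print(memory[0x80])   # prints 42
```

A full 6502 core is this same dispatch loop scaled up to all the addressing modes and instructions, plus status flags and cycle counting.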
+ +00:47:08.120 --> 00:47:12.320 +You're starting to understand how does our software + +00:47:12.920 --> 00:47:14.420 +actually get run on hardware + +00:47:14.920 --> 00:47:17.760 +by re-implementing what a CPU actually does. + +00:47:18.120 --> 00:47:19.740 +And you're re-implementing the memory map too. + +00:47:19.840 --> 00:47:21.540 +So how does a CPU connect to memory? + +00:47:21.940 --> 00:47:23.940 +How does a CPU connect to the graphics processor? + +00:47:24.680 --> 00:47:26.480 +You're doing all of that in that chapter. + +00:47:26.940 --> 00:47:27.700 +And so it's kind of like- + +00:47:27.940 --> 00:47:34.260 +Sure. You talked about like writing an interpreter for the BF in basic languages. Here you're + +00:47:34.600 --> 00:47:41.980 +completely emulating the hardware of the NES and you're giving it the potentially given it + +00:47:42.440 --> 00:47:45.160 +open source NES games and it can run them, right? + +00:47:45.520 --> 00:47:49.840 +Correct. Yeah. It runs several NES open source games. Like I said, I didn't put it for legal + +00:47:50.000 --> 00:47:54.280 +reasons, but it can run real commercial games as well for the NES. And so you're doing the whole + +00:47:54.300 --> 00:47:59.920 +soup to nuts like entire system except the audio. But it is Python and it's pure Python, except for, + +00:48:00.000 --> 00:48:05.040 +of course, the library we're using. We're using Pygame for displaying the window, which is written + +00:48:05.090 --> 00:48:12.020 +a lot of in C. But because it's pure Python, the emulator doesn't run at full speed. So it runs on + +00:48:12.260 --> 00:48:18.180 +my Mac. It runs at about 15 frames per second. The real NES ran at 60 frames per second. So we leave + +00:48:18.180 --> 00:48:24.260 +as an exercise to the reader. Yeah, go use Cython or something like that. 
And I'm sure you can get + +00:48:24.280 --> 00:48:25.540 +up with several different techniques, + +00:48:25.620 --> 00:48:26.940 +not just with, definitely with Cython, + +00:48:27.280 --> 00:48:28.160 +but with several different techniques, + +00:48:28.300 --> 00:48:29.920 +you could get this up to 60 frames per second, + +00:48:29.960 --> 00:48:31.140 +but you're going to have to incorporate + +00:48:32.520 --> 00:48:34.560 +something that is, you know, using, + +00:48:35.760 --> 00:48:37.140 +that gets outside of just pure Python. + +00:48:37.420 --> 00:48:37.640 +Sure. + +00:48:38.000 --> 00:48:39.640 +You know, one, there's a bunch of ways + +00:48:39.760 --> 00:48:40.200 +that you already mentioned, + +00:48:40.280 --> 00:48:43.400 +but one new way that's kind of officially new, + +00:48:43.740 --> 00:48:46.300 +just this month is free-threaded Python. + +00:48:46.800 --> 00:48:47.460 +And I've been doing, + +00:48:47.640 --> 00:48:50.020 +I went back to my async programming course + +00:48:50.440 --> 00:48:52.140 +where there's a lot of examples of like, + +00:48:52.260 --> 00:48:53.220 +let's do this synchronously. + +00:48:53.360 --> 00:48:54.460 +Now let's do it with threads. + +00:48:54.600 --> 00:48:55.780 +Let's do it with multiprocessing. + +00:48:55.790 --> 00:48:57.260 +Now let's do it with asyncio, right? + +00:48:57.260 --> 00:48:58.580 +And see like how those all compare. + +00:48:58.920 --> 00:49:00.700 +And the threads one, anything was computational. + +00:49:01.000 --> 00:49:02.040 +It's like, yeah, it's the same speed + +00:49:02.090 --> 00:49:03.440 +or maybe even slower, right? + +00:49:03.760 --> 00:49:08.200 +But then I ran it with, I did uv run --Python + +00:49:08.590 --> 00:49:11.620 +3.14 T and it went 10 times faster + +00:49:12.080 --> 00:49:14.280 +or eight times faster on my 10 core machine. 
+
+00:49:14.820 --> 00:49:16.320
+And it's literally just running on the,
+
+00:49:16.370 --> 00:49:19.140
+I didn't change the code for free-threaded Python
+
+00:49:19.360 --> 00:49:20.040
+and it just took off.
+
+00:49:20.240 --> 00:49:29.540
+So I'm thinking here, graphics, amongst many things, is primed for this embarrassingly parallel processing.
+
+00:49:29.930 --> 00:49:30.060
+Yeah.
+
+00:49:30.060 --> 00:49:34.880
+You could break up the screen into like little chunks and go, well, we got 10 chunks and 10 cores.
+
+00:49:35.460 --> 00:49:35.940
+Let's go at it.
+
+00:49:36.340 --> 00:49:38.080
+What do you think about it as a plausibility?
+
+00:49:39.220 --> 00:49:46.180
+Unfortunately, for a lot of emulators, especially emulators that are for more modern systems, threading would be a huge advantage.
+
+00:49:46.620 --> 00:49:50.360
+For the NES, it's not, because of the way that everything was timed.
+
+00:49:51.080 --> 00:49:59.960
+So every time that the CPU does one cycle, the PPU, the picture processing unit, has to be timed to do exactly three cycles.
+
+00:50:00.540 --> 00:50:04.700
+So you can't just go ahead and, you know, and you don't just give it work.
+
+00:50:04.780 --> 00:50:07.440
+It's not like a modern GPU where you just give it a bunch of chunks of work.
+
+00:50:07.450 --> 00:50:10.280
+It works on its own and then provides that to the screen.
+
+00:50:10.740 --> 00:50:12.640
+It's completely synchronized to the CPU.
+
+00:50:13.120 --> 00:50:15.060
+And so is the audio processing unit, too.
+
+00:50:15.240 --> 00:50:18.020
+So it all has to kind of be synchronized within a single thread.
+
+00:50:18.300 --> 00:50:20.780
+And all the games assume that that's how it works, right?
+
+00:50:20.980 --> 00:50:21.980
+That's how they have it all set.
+
+00:50:22.050 --> 00:50:22.180
+Yeah.
+
+00:50:22.580 --> 00:50:22.760
+Yeah.
+
+00:50:22.940 --> 00:50:23.000
+Yeah.
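The lockstep timing just described, three PPU cycles for every CPU cycle, is why the emulation loop stays single-threaded. A hypothetical sketch with stub components (my own, not the book's emulator) shows the shape of that loop:

```python
# Stub CPU/PPU to illustrate lockstep emulation timing: the PPU must
# advance exactly three cycles per CPU cycle, all in one thread.
# (Hypothetical sketch; real components would execute instructions
# and draw scanlines in their step() methods.)

class StubCPU:
    def __init__(self):
        self.cycles = 0
    def step(self):
        self.cycles += 1      # pretend each instruction takes one cycle

class StubPPU:
    def __init__(self):
        self.cycles = 0
    def step(self):
        self.cycles += 1      # pretend this advances one dot of the frame

def run_frame(cpu, ppu, cpu_cycles):
    # The emulation loop: CPU step, then the PPU catches up at a 3:1 ratio.
    for _ in range(cpu_cycles):
        cpu.step()
        for _ in range(3):
            ppu.step()

cpu, ppu = StubCPU(), StubPPU()
run_frame(cpu, ppu, 100)
assert ppu.cycles == 3 * cpu.cycles
```

Splitting cpu.step() and ppu.step() onto separate threads would break the games, which depend on exactly this cycle ratio and on events like vblank landing at precise times.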
+ +00:50:23.140 --> 00:50:28.120 +And what they had, what was called a V blank period, which was a period where it had already + +00:50:28.340 --> 00:50:32.160 +drawn the screen and then it actually tells the CPU, hey, I'm done drawing the screen. + +00:50:32.340 --> 00:50:35.700 +Now you can do all kinds of work and be ready for the next time that you draw the next frame + +00:50:35.790 --> 00:50:35.900 +again. + +00:50:36.550 --> 00:50:40.360 +So they had really tight synchronization between the graphics and the CPU. + +00:50:40.880 --> 00:50:41.100 +Interesting. + +00:50:41.450 --> 00:50:41.840 +Still cool. + +00:50:42.160 --> 00:50:42.740 +Still pretty cool. + +00:50:43.220 --> 00:50:44.300 +There's a lot going on there. + +00:50:44.360 --> 00:50:47.820 +If you want to understand hardware, that's a pretty low level look at it, right? + +00:50:48.080 --> 00:50:50.180 +I think it's kind of the crown jewel of the book. + +00:50:50.460 --> 00:50:53.700 +You know, there's a lot of people who are interested in emulators. + +00:50:54.080 --> 00:51:00.400 +And what I found before writing the book is there's not a single book out there that talks + +00:51:00.620 --> 00:51:04.040 +about writing an NES emulator, which is the most common emulator that people want to write. + +00:51:04.420 --> 00:51:07.700 +So I think this is the first book that has a chapter on writing an NES emulator. + +00:51:08.020 --> 00:51:12.020 +I will admit I was a little scared of like if Nintendo's legal team is going to like + +00:51:12.360 --> 00:51:14.080 +be upset about the book or something like that. + +00:51:14.160 --> 00:51:15.600 +but we did research on it. + +00:51:15.600 --> 00:51:17.300 +I mean, everything we're doing is perfectly legal + +00:51:17.750 --> 00:51:19.280 +in the book in the United States, at least. + +00:51:20.140 --> 00:51:22.560 +So, you know, but anyway. + +00:51:22.590 --> 00:51:22.900 +Scary, right? + +00:51:23.160 --> 00:51:23.980 +Yeah, absolutely. 
+ +00:51:24.320 --> 00:51:26.760 +And we put a big disclaimer at the top of the chapter + +00:51:26.810 --> 00:51:27.460 +as you remember reading. + +00:51:28.500 --> 00:51:32.160 +I have a legal, I have a Nintendo story for you. + +00:51:32.420 --> 00:51:32.660 +Great. + +00:51:32.840 --> 00:51:33.900 +From when I was young. + +00:51:34.070 --> 00:51:38.420 +And this was, I think this is very late 80s, + +00:51:38.560 --> 00:51:40.220 +early 90s, last century. + +00:51:40.580 --> 00:51:42.720 +Now, this was not me, someone I knew, + +00:51:42.960 --> 00:51:44.980 +someone I was friends with had a, + +00:51:45.150 --> 00:51:46.380 +I think it was a super NES. + +00:51:47.040 --> 00:51:47.140 +Okay. + +00:51:47.280 --> 00:51:48.860 +And people may not remember, + +00:51:49.240 --> 00:51:51.940 +but they had these big honking cartridges go chonk. + +00:51:51.940 --> 00:51:53.560 +And you would chunk them down in there. + +00:51:53.830 --> 00:51:54.560 +You would turn on, + +00:51:54.740 --> 00:51:58.080 +you would basically reboot the NES and it would boot up from the, + +00:51:58.250 --> 00:52:00.520 +the big cartridge game cartridge you put in there. + +00:52:00.820 --> 00:52:03.980 +And there was this guy that they've, + +00:52:04.280 --> 00:52:05.820 +they found somehow I don't, + +00:52:05.960 --> 00:52:06.660 +cause there's no internet. + +00:52:06.880 --> 00:52:08.020 +Maybe it's through a BBS. + +00:52:08.090 --> 00:52:09.200 +I don't know how this was found, + +00:52:09.440 --> 00:52:12.460 +but they found this guy who built a thing that the, + +00:52:12.720 --> 00:52:19.020 +bottom half of it looked like a game cartridge. The top was this whole computer that at the top of it, + +00:52:19.220 --> 00:52:25.480 +it had a 3.5 inch floppy drive. Wow. 
So you would go to your BBS, you would find a game,
+
00:52:25.760 --> 00:52:30.900
+you would put it on a floppy, and then you would slot this huge extra computer down into the NES
+
00:52:31.140 --> 00:52:36.500
+and you would turn it on and it would like delegate the file IO through the little fake cartridge
+
00:52:37.040 --> 00:52:41.320
+back to it. And it worked like crazy. It was perfect. That guy was not in business for very
+
00:52:41.340 --> 00:52:46.540
+long, the guy who sold them. Not my friend, but the person he got it from, wasn't available
+
00:52:46.780 --> 00:52:53.000
+after a while. So this is, so Super NES comes out in, like, '91. Um, so this might have been, a regular...
+
00:52:53.440 --> 00:52:57.740
+No, I wasn't. I think I might have been in college, so yeah, it would be, it would be '91 onward. Yeah.
+
00:52:58.020 --> 00:53:03.520
+Okay. I mean, that is, '93, that is very advanced, like, for that time. When you think about the type of
+
00:53:03.700 --> 00:53:08.119
+stuff people, like, were modding at that time, that guy was a genius. Yeah, he was. He should have been
+
00:53:08.100 --> 00:53:13.900
+hired if he wasn't, you know what I mean? Yeah, yeah, absolutely. Yeah, he was building them himself out of his
+
00:53:14.180 --> 00:53:18.860
+dorm room or something. I don't know. It was, I think it was in, uh, Manhattan, at, uh, Kansas State
+
00:53:18.970 --> 00:53:23.640
+University. I'm not sure where the guy was, but wow. Anyway, yeah, that guy was playing with fire.
+
00:53:24.410 --> 00:53:30.920
+Yeah, absolutely. And you know what? Um, sometimes, like, understanding how the ROM cartridge works,
+
00:53:31.140 --> 00:53:36.700
+right? Like, we think again that it's like all magic, right? But actually, what those big cartridges that
+
00:53:36.720 --> 00:53:41.420
+we had, that we, like, inserted in the NES were, is they were mostly plastic, like, that did nothing.
+
00:53:41.900 --> 00:53:46.180
+And there was a little ROM chip, like inside the big hunk of plastic and the ROM chip,
+
00:53:46.380 --> 00:53:50.700
+ROM just stands for read-only memory. So most of them were just like a big chunk of memory
+
00:53:51.180 --> 00:53:55.320
+with a very tiny bit of logic chips that were called mappers on the NES that said like,
+
00:53:55.500 --> 00:53:59.560
+read this memory at this time. But again, it wasn't really magic. And those cartridges look
+
00:53:59.640 --> 00:54:04.440
+so intimidating, but they were really just a big piece of memory. Yeah. Think about how much you've
+
00:54:04.460 --> 00:54:10.200
+got to make that work right before you ship it. Yeah. There's no patches. There's no download and
+
00:54:10.360 --> 00:54:14.600
+update. Once it's shipped, it's shipped and it's burned into the ROM and that's it.
+
00:54:14.820 --> 00:54:19.880
+I have some friends who work in game development and like before they even ship the 1.0,
+
00:54:20.300 --> 00:54:24.940
+nowadays they're working on the first patch, like, and they just know they're not going to ship like
+
00:54:24.950 --> 00:54:30.400
+the perfect program in 1.0. Now, to be fair, games were much simpler in the 1980s, right,
+
00:54:30.560 --> 00:54:35.300
+than they are today, but they had to be perfect, like literally had to be perfect. And so
+
00:54:35.400 --> 00:54:41.200
+there was a certain, I think, different attitude around game development then than there is
+
00:54:41.340 --> 00:54:46.160
+today. And remember, they were also working in assembly language. So it was no easy language
+
00:54:46.360 --> 00:54:51.340
+to work in. I don't know if people like this word, but I would call it kind of hardcore, writing
+
00:54:51.600 --> 00:54:57.580
+NES games in the 1980s. It was, you know, that kind of programming, it's, it's just, it was just
+
00:54:57.600 --> 00:55:03.360
+different. It was just different.
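The "big chunk of memory with a tiny bit of mapper logic" idea can be sketched very compactly. This is a hypothetical, heavily simplified NROM-style read path (class name and parameters are illustrative, not from the book): the cartridge is just bytes, and the "mapper" only decides which offset of the ROM a given CPU address reads.

```python
class Cartridge:
    """A cartridge as the episode describes it: read-only bytes plus
    a trivial bit of address-mapping logic."""

    def __init__(self, prg_rom: bytes):
        self.prg_rom = prg_rom

    def read(self, address: int) -> int:
        # CPU addresses $8000-$FFFF map onto the program ROM. If the ROM
        # is a single 16 KiB bank, taking the offset modulo the ROM size
        # mirrors it into both halves of that range, NROM-style.
        offset = (address - 0x8000) % len(self.prg_rom)
        return self.prg_rom[offset]
```

Writes simply don't exist on this path, which is the whole "no patches, it's burned in" point: the hardware gives the CPU a read-only window into the bytes the publisher shipped.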
You had to be so detail oriented and you had to be so thorough. + +00:55:03.720 --> 00:55:08.540 +It's almost like writing spaceship control software type of thing. Not quite, but almost. + +00:55:08.940 --> 00:55:16.140 +And the machines were so slow. The NES, which we emulate in chapter six, ran on a 1.8 megahertz + +00:55:16.640 --> 00:55:25.340 +CPU, 6502, 1.8 megahertz. So Super Mario was like an incredible accomplishment on a 1.8 megahertz. + +00:55:25.580 --> 00:55:36.420 +So people had to worry about all these computer science topics in a way that they don't today as programmers, because you had to squeeze every last bit of algorithmic performance out of the machine. + +00:55:36.960 --> 00:55:42.060 +And so today, I think we have sloppier software today because we have an embarrassment of riches. + +00:55:42.250 --> 00:55:47.840 +We got such powerful computers with so much memory that people don't worry about writing things as efficiently as possible. + +00:55:48.190 --> 00:55:55.140 +And so we end up with inefficient software sometimes because people don't bother to do the algorithms right. + +00:55:55.360 --> 00:56:00.240 +Yes, but I'm going to put a little double dagger, like see the footnote by your comment. + +00:56:00.900 --> 00:56:04.700 +Absolutely true for software, not necessarily true for software developers. + +00:56:05.160 --> 00:56:05.500 +Agreed. + +00:56:05.570 --> 00:56:12.880 +The efficiency of writing a 99.9% correct Python program, how quickly that gets done versus + +00:56:13.380 --> 00:56:16.360 +like something in assembly language straight on a ROM. + +00:56:16.740 --> 00:56:16.920 +Yep. + +00:56:17.180 --> 00:56:17.580 +You know what I mean? 
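The numbers mentioned here make the constraint concrete. A quick back-of-the-envelope calculation (assumed values: the NES CPU actually ran at roughly 1.79 MHz, and NTSC video refreshes at about 60 Hz):

```python
# Rough cycle budget per frame on the NES's 6502-family CPU.
cpu_hz = 1_790_000          # ~1.79 MHz, commonly rounded to "1.8 megahertz"
frames_per_second = 60      # NTSC refresh, approximately

cycles_per_frame = cpu_hz // frames_per_second
print(cycles_per_frame)
```

That's on the order of 30,000 CPU cycles to read input, run game logic, update sprites, and feed the audio unit for every single frame of Super Mario, which is why squeezing out algorithmic performance wasn't optional.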
+
00:56:17.830 --> 00:56:22.140
+These are the, I think this is an interesting arc, and maybe you could speak to it a little
+
00:56:22.100 --> 00:56:27.120
+bit from your academic perspective, but I think it's a really interesting arc of, it used to be
+
00:56:27.200 --> 00:56:32.260
+really hard to program and we'd solve small problems with lots of effort, and it's getting
+
00:56:32.640 --> 00:56:37.260
+easier. And a lot of times it gets easier, people say, well, there it goes, job's over. There's not
+
00:56:37.260 --> 00:56:41.700
+going to be programmers anymore because everyone can do it. And we just solve bigger, more complex
+
00:56:41.920 --> 00:56:47.820
+problems. And that just keeps building, you know, like the, a web app, a basic e-commerce app from
+
00:56:47.840 --> 00:56:55.260
+today was a keynote presentation of how, like, a Fortune 500 company pulled it off in 1990. You
+
00:56:55.300 --> 00:57:00.280
+know what I mean? Like, yeah, yeah. Here's how we got the SSL termination, special machines to handle the
+
00:57:00.480 --> 00:57:06.020
+SSL decryption, so that we could do this in under two seconds a request. I mean, like, crazy stuff. Yeah.
+
00:57:06.140 --> 00:57:09.940
+What do you think about this? Well, it's not an either-or. I mean, we want developers to be productive
+
00:57:10.300 --> 00:57:15.340
+and we also want efficient software. So, I mean, the good news is that means that after we make
+
00:57:15.120 --> 00:57:19.840
+it easier and easier to build software, we still need folks who are going to work on optimizing that
+
00:57:19.980 --> 00:57:24.140
+software after it's built. So there's still jobs for software developers who are interested in that.
+
00:57:24.500 --> 00:57:28.420
+I think you actually touched on this in Talk Python in Production to some degree, because you
+
00:57:28.580 --> 00:57:33.760
+talk about in the book the appropriate level of complexity for the type of app that you're
+
00:57:33.960 --> 00:57:38.320
+building. And so you talk about like, you know, maybe you don't need a Kubernetes cluster if
+
00:57:38.420 --> 00:57:43.900
+you're just building, you know, something for 10 million requests a month and not something for a
+
00:57:43.920 --> 00:57:48.360
+billion requests a month. And it's going to be run by one person or one person part-time versus a
+
00:57:48.540 --> 00:57:55.180
+team or somebody who's a DevOps expert, right? Right, right, right. So if I'm working on a run-
+
00:57:55.320 --> 00:58:00.000
+of-the-mill e-commerce thing, maybe I don't worry about squeezing out every last bit of performance
+
00:58:00.200 --> 00:58:04.220
+because I want to productively make something that, like you said, is 99.9% of the way there
+
00:58:04.780 --> 00:58:10.920
+as quickly as possible. But if I'm building a 3D game, they still build most 3D games in C++
+
00:58:10.940 --> 00:58:14.060
+because they need to squeeze out every last bit of performance,
+
00:58:14.670 --> 00:58:17.240
+even if it's a little less efficient for the programmers.
+
00:58:17.310 --> 00:58:18.820
+The programmers could write the games faster,
+
00:58:18.970 --> 00:58:21.120
+maybe in another language or another framework
+
00:58:21.300 --> 00:58:21.860
+than they're using, whatever.
+
00:58:22.320 --> 00:58:24.400
+But they use something that squeezes out
+
00:58:24.660 --> 00:58:25.460
+every last bit of performance.
+
00:58:25.820 --> 00:58:26.900
+So I think you have to think about
+
00:58:26.990 --> 00:58:28.260
+what is the app that we're building?
+
00:58:28.480 --> 00:58:29.820
+What is the domain that we're in?
+
00:58:30.110 --> 00:58:31.920
+And it's also just not an either-or.
+
00:58:32.200 --> 00:58:34.060
+Like, you know, we want to have software
+
00:58:34.220 --> 00:58:36.760
+that's both developer productive to build,
+
00:58:37.220 --> 00:58:38.680
+but also that's efficient to run.
+
00:58:39.200 --> 00:58:40.480
+And it's an ongoing process.
+
00:58:40.840 --> 00:58:44.240
+We might need to make that trade-off more in the direction of developer
+
00:58:44.500 --> 00:58:48.460
+productivity when we're first building it. And then as we're maintaining it, maybe we go more towards the efficiency side.
+
00:58:48.680 --> 00:58:53.380
+But if you don't think about some of the efficiency things up front, you can end up in a bad spot
+
00:58:53.800 --> 00:58:58.780
+because you can get some technical debt. You could end up in a situation where it's hard to undo
+
00:58:59.400 --> 00:59:03.140
+some of the algorithmic mistakes that you made early on and the system starts to depend on.
+
00:59:03.380 --> 00:59:04.820
+So you should think about it at least a little bit.
+
00:59:05.180 --> 00:59:09.020
+Yeah, absolutely. And I think you're talking about they've got to squeeze out that little
+
00:59:09.040 --> 00:59:12.820
+extra bit of performance because they want high frame rates. They might have to squeeze out that
+
00:59:12.890 --> 00:59:17.580
+little bit of extra performance because it's either possible or not possible. You know, you look at some
+
00:59:17.580 --> 00:59:23.000
+of the stuff in the Unreal Engine these days, you're like, that's real time? That's the game
+
00:59:23.180 --> 00:59:30.520
+that looks like a rendered CGI movie, not a game. Yeah, yeah, yeah. And, I mean, so a lot of it
+
00:59:30.560 --> 00:59:34.820
+is being done for us today by library authors, right? So library authors are thinking about a
+
00:59:34.820 --> 00:59:39.600
+lot of the low-level stuff. So us as like run-of-the-mill developers don't need to, which is
+
00:59:39.730 --> 00:59:44.640
+great.
But if you're doing something really de novo, something, you know, with an innovative
+
00:59:44.980 --> 00:59:50.180
+algorithm, you still need to consider efficiency at least. Right. And the other thing is,
+
00:59:50.400 --> 00:59:55.180
+you can write really slow code in C and you can write really fast code in Python, or vice versa.
+
00:59:55.420 --> 01:00:00.420
+It's easier to make it slow in Python, to be fair. But algorithms and data structures are
+
01:00:01.500 --> 01:00:05.720
+tremendous influences, right? If you're using, you should have been using a set or a dictionary and
+
01:00:05.820 --> 01:00:10.580
+you're using a list. Yeah. You're probably having a real bad time performance-wise. And you caught me
+
01:00:10.780 --> 01:00:15.120
+multiple times on the technical review in places where I should have used a set and I used a list.
+
01:00:15.140 --> 01:00:19.200
+So thank you for that. But, but that's absolutely, that's a great example. I mean,
+
01:00:19.400 --> 01:00:24.960
+just knowing that, that basic fact that, you know, in this situation, just swapping out,
+
01:00:25.140 --> 01:00:29.800
+which is a simple switch, by the way, you know, one data structure for another can totally change
+
01:00:29.820 --> 01:00:34.120
+performance characteristics. That's the type of thing you do learn in a CS degree that you
+
01:00:34.320 --> 01:00:37.920
+don't necessarily learn when you're a self-taught programmer kind of hacking everything out on your
+
01:00:38.040 --> 01:00:41.940
+own, which is why a book like Classic Computer Science Problems in Python, or maybe Computer
+
01:00:42.040 --> 01:00:47.260
+Science from Scratch, is good for that. Yeah. It exposes you to this. Here's a data structure you
+
01:00:47.440 --> 01:00:51.600
+haven't thought about, but here's why we're using it and the advantages and stuff. Yeah, absolutely.
+
01:00:51.940 --> 01:00:57.220
+Well, we've covered a lot.
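The set-versus-list point made above is easy to see in a few lines: membership tests on a list are O(n) scans, while a set is O(1) on average, so the "simple switch" of one data structure really does change the performance profile. The sizes below are just illustrative.

```python
import timeit

# Build the same data in both shapes.
items_list = list(range(100_000))
items_set = set(items_list)

# Worst case for the list: the element is at the very end,
# so every lookup scans all 100,000 entries.
needle = 99_999

t_list = timeit.timeit(lambda: needle in items_list, number=200)
t_set = timeit.timeit(lambda: needle in items_set, number=200)
print(f"list membership: {t_list:.4f}s, set membership: {t_set:.4f}s")
```

Same answer either way; the set version is typically orders of magnitude faster, which is exactly the kind of thing a technical reviewer catches.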
We've covered pretty much solved computer science in 2025. That's good. + +01:00:57.300 --> 01:01:04.300 +And then the book is bringing computer science to so many people who are self-taught, you know, like me. + +01:01:04.580 --> 01:01:09.100 +I went to college, but I studied many, many years of math and found my way into programming. + +01:01:09.400 --> 01:01:11.860 +So I took a few CS programs, but not so many. + +01:01:12.060 --> 01:01:15.400 +So, you know, it certainly showed me many things that I hadn't seen. + +01:01:15.420 --> 01:01:18.200 +Like I've never emulated hardware before, for example. + +01:01:18.380 --> 01:01:19.260 +That was wild. + +01:01:19.960 --> 01:01:26.060 +And, you know, Michael, I was reading in Talk Python in production that you actually worked on Windows desktop apps. + +01:01:26.240 --> 01:01:26.660 +Is that right? + +01:01:26.660 --> 01:01:26.860 +I did. + +01:01:27.220 --> 01:01:29.640 +Yeah, for 10 years, more than 10 years. + +01:01:29.720 --> 01:01:30.800 +I'm almost 15 years, yeah. + +01:01:31.120 --> 01:01:32.760 +Wow, so I'm curious, + +01:01:33.400 --> 01:01:35.800 +how did you feel about that experience + +01:01:36.420 --> 01:01:38.800 +versus the kind of Python GUI frameworks + +01:01:39.160 --> 01:01:39.760 +that exist today? + +01:01:40.800 --> 01:01:42.280 +You heard my rant about Visual Basic. + +01:01:42.470 --> 01:01:43.960 +Yeah, and I was thinking of that. + +01:01:44.600 --> 01:01:46.120 +I started actually in C++, + +01:01:46.800 --> 01:01:49.120 +starting the hard way, doing MFC. + +01:01:49.330 --> 01:01:51.180 +I don't know if you ever played with that or heard of that. + +01:01:51.330 --> 01:01:52.180 +Yeah, Microsoft Foundation classes. + +01:01:52.490 --> 01:01:53.780 +Yeah, yeah, and that was actually pretty decent + +01:01:54.000 --> 01:01:55.180 +for a C++ framework. + +01:01:55.220 --> 01:01:59.100 +And then as soon as I found Visual Basic and C#, I'm like, this is so much better. 
+ +01:01:59.640 --> 01:02:03.680 +It goes from weeks to days of UI work, you know, and stuff like that. + +01:02:03.720 --> 01:02:07.940 +And it took me a while to really appreciate building for the web. + +01:02:08.520 --> 01:02:10.460 +You know, I think I probably made that switch around the year 2000. + +01:02:11.080 --> 01:02:14.020 +There's a little bit after that, but I really like the web these days. + +01:02:14.120 --> 01:02:15.160 +I think the web is special. + +01:02:15.560 --> 01:02:20.540 +I just wish it was easier to take apps from the web and get them to people. + +01:02:20.820 --> 01:02:24.060 +For example, Firefox canceled progressive web apps. + +01:02:24.480 --> 01:02:27.560 +iOS has them, but they're kind of, let's not talk about those. + +01:02:27.750 --> 01:02:30.400 +And if you know the secret, you can probably install one, but you probably shouldn't. + +01:02:30.470 --> 01:02:35.640 +And we're going to, you know, like if it's just, it's like right on the verge of one more + +01:02:35.880 --> 01:02:39.680 +step and the web would be really, a really good replacement for those. + +01:02:39.920 --> 01:02:43.740 +And now we have things like Electron and so on, which I'm not a huge fan of, but it's, + +01:02:44.040 --> 01:02:48.620 +it's kind of what we need to make that happen, but we don't necessarily need Electron, right? + +01:02:48.680 --> 01:02:49.400 +It would be really great. + +01:02:49.580 --> 01:02:50.820 +I'm also really excited. + +01:02:50.890 --> 01:02:53.740 +I don't know how you feel about, I'm really excited about PyScript. + +01:02:54.320 --> 01:02:56.100 +and the possibility of running Python + +01:02:56.560 --> 01:02:57.580 +for your front-end stuff? + +01:02:57.820 --> 01:03:00.080 +Well, I guess what I was getting at is + +01:03:00.400 --> 01:03:02.940 +why do you think Python has not taken off more + +01:03:03.360 --> 01:03:04.680 +for desktop GUI development? 
+
01:03:04.900 --> 01:03:06.520
+So like we've seen things like Kivy,
+
01:03:06.680 --> 01:03:09.980
+of course we have PyQt and several other frameworks
+
01:03:10.160 --> 01:03:13.060
+that wrap into older C++ GUI frameworks.
+
01:03:13.640 --> 01:03:16.360
+But like Python now is so mature,
+
01:03:16.410 --> 01:03:17.820
+it's the most popular language in the world.
+
01:03:18.180 --> 01:03:19.480
+There's nothing preventing it.
+
01:03:19.680 --> 01:03:20.820
+Yeah, there's nothing preventing it
+
01:03:20.980 --> 01:03:22.540
+from making a really nice platform
+
01:03:22.560 --> 01:03:25.880
+for native app development, right?
+
01:03:26.380 --> 01:03:27.760
+Fundamentally, you could wrap,
+
01:03:28.100 --> 01:03:29.100
+you could come up with a framework
+
01:03:29.320 --> 01:03:31.480
+that abstracts talking to Objective-C,
+
01:03:32.080 --> 01:03:34.460
+the APIs there, or the Win32 API.
+
01:03:34.960 --> 01:03:36.700
+You know, one of the things I'm starting to realize
+
01:03:36.980 --> 01:03:39.180
+that makes desktop development really different
+
01:03:39.440 --> 01:03:41.440
+from back in the day, if you will,
+
01:03:41.700 --> 01:03:44.500
+is there are so many gatekeepers and barriers
+
01:03:44.650 --> 01:03:47.020
+to getting your app onto a machine, right?
+
01:03:47.140 --> 01:03:50.420
+If I built a cool desktop app and I gave it to you,
+
01:03:50.760 --> 01:03:52.100
+your Mac would go, no,
+
01:03:52.320 --> 01:03:56.520
+we're not running that. We moved it to the trash for you because it wasn't notarized.
+
01:03:56.780 --> 01:04:02.700
+Right. And something similar happens on Windows. And so there's these steps you got to jump through.
+
01:04:03.100 --> 01:04:10.480
+And I think there's just been too many gotchas and steps for anybody to push a framework or a way of
+
01:04:10.600 --> 01:04:15.920
+doing this all the way through.
I mean, in a sense, the web is kind of like good enough that we don't
+
01:04:16.120 --> 01:04:21.280
+have to figure out a way to build this. I think honestly, if I were to be more concerned, I'd be
+
01:04:21.180 --> 01:04:26.400
+more concerned that we can't create truly straightforward mobile apps with Python than
+
01:04:26.580 --> 01:04:30.540
+desktop apps. Yeah. What do you think? That's where I was going to go to next. Yeah. And I mean,
+
01:04:30.720 --> 01:04:36.740
+Kivy has been an attempt at that. But, you know, I don't really think it's gotten a ton of traction
+
01:04:36.860 --> 01:04:41.080
+from what I see. It doesn't look like it's gotten a ton of traction in the way people in the Python
+
01:04:41.340 --> 01:04:47.200
+community hoped it would. So, you know, not that they don't do good work, but it's also not there yet,
+
01:04:47.340 --> 01:04:53.920
+I don't think. I wonder if some of the performance issues, you know, are part of this. So, you know,
+
01:04:54.140 --> 01:04:59.580
+people expect, one of the reasons we like desktop apps and native mobile apps over web apps is
+
01:04:59.700 --> 01:05:04.660
+because we get instantaneous feedback and, you know, really high performance in our user interfaces.
+
01:05:05.180 --> 01:05:12.260
+And, you know, I still find even a PyQt app sometimes a little bit slower than, you know,
+
01:05:12.340 --> 01:05:19.560
+a regular Qt app. And, you know, it's unfortunate. For me, like, the big want for the whole Python
+
01:05:19.840 --> 01:05:24.180
+ecosystem is what started to be the focus, I think, of the core developers, which is performance
+
01:05:24.440 --> 01:05:28.600
+improvements. Like, I think that performance improvements would make everyone's lives in the
+
01:05:28.800 --> 01:05:35.381
+ecosystem so much better, you know. And so I think rightly that this has become like one of the central
+
01:05:35.400 --> 01:05:41.960
+focuses of, you know, of the core developers. Yeah.
Guido, Mark Shannon, Brandt Bucher,
+
01:05:42.300 --> 01:05:45.520
+a bunch of people whose names I'm not including, they've all done really good work over the last
+
01:05:45.620 --> 01:05:50.220
+three or four years. I was a little bit sad to see Microsoft cancel that project or cancel the
+
01:05:50.360 --> 01:05:54.980
+funding for that project. I mean, the project continues, but still, you know, what has happened
+
01:05:55.080 --> 01:05:59.540
+is so, so much better. That is really a big deal. Yeah. Yeah. And, you know, I even saw it in the
+
01:05:59.630 --> 01:06:05.360
+course of writing the book. I started writing the book in 2021, and that NES emulator on the same machine
+
01:06:05.380 --> 01:06:09.380
+was something like 12 frames per second in 2021.
+
01:06:10.050 --> 01:06:12.560
+And by 2025, it was like 17 frames,
+
01:06:12.590 --> 01:06:13.620
+or 15 or 17 frames per second.
+
01:06:13.620 --> 01:06:13.900
+Yeah, nice.
+
01:06:15.280 --> 01:06:17.880
+I really saw it in these computationally intensive
+
01:06:18.660 --> 01:06:19.360
+programs in the book.
+
01:06:19.640 --> 01:06:20.860
+I'm pretty positive for it.
+
01:06:20.860 --> 01:06:23.920
+I mean, there's a lot of things that could be better,
+
01:06:24.100 --> 01:06:26.700
+but I think one of the real superpowers is
+
01:06:27.080 --> 01:06:29.520
+it's approachable, but the ceiling of Python,
+
01:06:29.720 --> 01:06:31.640
+that is, its ceiling of what you can accomplish,
+
01:06:31.840 --> 01:06:33.100
+is not that low, right?
+
01:06:33.180 --> 01:06:36.860
+You can go pretty far if you have CS skills and ideas.
+
01:06:37.160 --> 01:06:39.600
+And then, you know, pip install, uv install.
+
01:06:40.240 --> 01:06:44.240
+The options of what is out there to just build and click together are incredible.
+
01:06:44.600 --> 01:06:47.680
+What do you think about, and I know you talked about it on the show before.
+ +01:06:47.770 --> 01:06:53.560 +I heard about a while ago on the show about Mojo and about, you know, a total attempt + +01:06:53.860 --> 01:07:00.160 +that, you know, let's just redo it and we'll get to keep the language syntax, but not the + +01:07:00.300 --> 01:07:00.460 +runtime. + +01:07:01.120 --> 01:07:01.680 +Right, right, right. + +01:07:01.900 --> 01:07:05.080 +And I mean, PyPy is also, of course, kind of an attempt at that as well. + +01:07:05.440 --> 01:07:07.140 +But and some people call for it, right? + +01:07:07.150 --> 01:07:10.620 +I see sometimes people are like, why don't they replace CPython with PyPy? + +01:07:10.960 --> 01:07:14.820 +What do you think about kind of just like the whole is too big a topic for the end? + +01:07:14.820 --> 01:07:15.820 +No, no, it's interesting. + +01:07:16.060 --> 01:07:18.060 +I think all of those are interesting. + +01:07:18.370 --> 01:07:21.580 +I think the Mojo performance story is very powerful. + +01:07:22.080 --> 01:07:27.940 +It's also really hard to bring over to CPython because there's so many different ways that it's used. + +01:07:28.140 --> 01:07:34.200 +There's so many, you know, it runs on this piece of hardware doing this thing that we just could never optimize for, you know? + +01:07:34.620 --> 01:07:43.280 +So, and then, like I said, with 600,000 or whatever there are packages, you know, how much of that are you willing to carve away to get a faster language? + +01:07:43.800 --> 01:07:51.380 +And what I think also is a really interesting aspect that people might not think about or take into account that often is you'll see a lot of these benchmarks. + +01:07:51.650 --> 01:07:56.360 +Like here's the three body, solving the three body problem in Python, and here's solving it in Mojo. + +01:07:56.520 --> 01:07:58.680 +Here's solving the three-body problem in Rust. + +01:07:58.760 --> 01:07:59.800 +And look at that huge difference. 
+
01:08:00.280 --> 01:08:01.960
+But what often happens in Python
+
01:08:02.620 --> 01:08:04.900
+is you find yourself orchestrating native code anyway.
+
01:08:05.160 --> 01:08:07.440
+Like, okay, we're going to use Polars
+
01:08:07.620 --> 01:08:08.500
+and we're going to do this thing.
+
01:08:08.720 --> 01:08:10.180
+But when I call the Polars functions,
+
01:08:10.460 --> 01:08:12.080
+I'm no longer running Python.
+
01:08:12.340 --> 01:08:15.180
+I'm running like a drop of Python and a bunch of Rust.
+
01:08:15.700 --> 01:08:16.799
+And then you're right back in the same,
+
01:08:16.980 --> 01:08:19.040
+or I'm talking to a database layer,
+
01:08:20.020 --> 01:08:23.859
+or my web app is running on a Rust-based server.
+
01:08:23.980 --> 01:08:25.520
+There's just all these little parts
+
01:08:25.520 --> 01:08:28.480
+where a lot of times your code speed
+
01:08:29.000 --> 01:08:30.920
+is more about how you're putting the pieces together.
+
01:08:31.400 --> 01:08:31.540
+Yeah.
+
01:08:31.779 --> 01:08:32.220
+Not always.
+
01:08:32.319 --> 01:08:34.040
+If you're doing computational stuff,
+
01:08:34.520 --> 01:08:35.440
+that's out the window potentially.
+
01:08:35.640 --> 01:08:36.980
+But when you're kind of, you know,
+
01:08:36.980 --> 01:08:39.380
+you're doing machine learning, you're doing web apps,
+
01:08:39.520 --> 01:08:40.359
+you're doing database calls.
+
01:08:40.520 --> 01:08:42.200
+A lot of these are like just a layer
+
01:08:42.720 --> 01:08:43.339
+and then off it goes.
+
01:08:44.420 --> 01:08:45.640
+Yeah, that totally makes sense.
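The "drop of Python, bunch of native code" point can be shown with nothing but the standard library: `sum()` runs its loop in C, while a hand-written loop executes Python bytecode for every element. Same result, different engine doing the work; the data size is illustrative.

```python
import time

data = list(range(1_000_000))

def python_loop(values):
    # Every iteration here goes through the Python interpreter.
    total = 0
    for v in values:
        total += v
    return total

t0 = time.perf_counter(); a = python_loop(data); t_py = time.perf_counter() - t0
# sum() is a thin Python entry point; the loop itself runs in C.
t0 = time.perf_counter(); b = sum(data); t_c = time.perf_counter() - t0

print(f"pure Python loop: {t_py:.4f}s, C-implemented sum(): {t_c:.4f}s")
```

Swap `sum()` for Polars, NumPy, or a Rust-based web server and the shape of the argument is the same: the Python layer mostly orchestrates, and the native code does the heavy lifting.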
+
01:08:45.859 --> 01:08:48.160
+But then I think it is holding Python back
+
01:08:48.500 --> 01:08:50.140
+from some of those, you know,
+
01:08:50.200 --> 01:08:51.819
+some of these real interesting domains
+
01:08:52.319 --> 01:08:53.859
+that people want to get into,
+
01:08:54.500 --> 01:08:56.500
+especially when they're first learning programming,
+
01:08:56.720 --> 01:08:57.540
+like 3D games.
+
01:09:02.200 --> 01:09:03.060
+Let's stick with that
+
01:09:03.180 --> 01:09:04.540
+because I'm thinking of ones for people
+
01:09:04.759 --> 01:09:05.799
+in computer science education.
+
01:09:06.520 --> 01:09:08.700
+A lot of people who study computer science
+
01:09:08.799 --> 01:09:10.259
+do so because they want to make a game, right?
+
01:09:10.359 --> 01:09:10.420
+Sure.
+
01:09:10.819 --> 01:09:12.040
+And when they want to make a game-
+
01:09:12.040 --> 01:09:12.839
+And I can see a world,
+
01:09:12.839 --> 01:09:14.720
+like look at one of the biggest game companies
+
01:09:14.720 --> 01:09:16.120
+in the world, it's Unity.
+
01:09:16.359 --> 01:09:16.460
+Yeah.
+
01:09:16.900 --> 01:09:17.279
+Right?
+
01:09:17.279 --> 01:09:18.299
+As a foundational,
+
01:09:18.299 --> 01:09:21.120
+like, build-your-game-with company, not a creator of games.
+
01:09:21.120 --> 01:09:22.960
+They're built, I believe, in C# and .NET,
+
01:09:22.960 --> 01:09:23.960
+if I remember correctly.
+
01:09:24.220 --> 01:09:28.040
+And that's a faster language, but it's not raging fast.
+
01:09:28.270 --> 01:09:33.299
+You know, it's a lot faster, but it's still a decent way from a C,
+
01:09:33.750 --> 01:09:35.420
+like a pure C language, right?
+
01:09:35.660 --> 01:09:35.759
+Yeah.
+
01:09:35.900 --> 01:09:37.020
+A pure C implementation.
+
01:09:37.640 --> 01:09:39.880
+And they're really, really successful.
+
01:09:39.950 --> 01:09:41.100
+They got close enough.
+
+01:09:41.560 --> 01:09:45.520
+I could easily see some company go, we're going to build a game engine.
+
+01:09:45.970 --> 01:09:48.940
+We're going to use Python as the language to get as many people
+
+01:09:48.990 --> 01:09:52.380
+who have been left out in the cold, in a sense, to do it.
+
+01:09:52.540 --> 01:09:54.320
+but we're going to do a Mojo-like thing.
+
+01:09:54.390 --> 01:09:55.520
+Or we're going to do something where
+
+01:09:55.790 --> 01:09:57.120
+you don't get to use every library,
+
+01:09:57.280 --> 01:09:59.560
+but do you really need Flask in your game?
+
+01:10:00.100 --> 01:10:00.880
+Not in your game.
+
+01:10:01.320 --> 01:10:01.680
+You know what I mean?
+
+01:10:01.800 --> 01:10:02.220
+Yeah, yeah, yeah.
+
+01:10:02.400 --> 01:10:05.100
+We're going to build like a smaller focused,
+
+01:10:05.400 --> 01:10:06.320
+high performance version
+
+01:10:06.720 --> 01:10:08.900
+that looks as close as it could be.
+
+01:10:09.200 --> 01:10:10.780
+And we're going to sell you a game engine
+
+01:10:11.200 --> 01:10:13.000
+and a way to ship those games on Steam,
+
+01:10:13.490 --> 01:10:16.480
+a way to ship those games on Metal to macOS, et cetera.
+
+01:10:16.680 --> 01:10:16.740
+Right?
+
+01:10:16.830 --> 01:10:18.200
+Like I could see that world happen.
+
+01:10:18.620 --> 01:10:19.040
+Yeah, yeah.
+
+01:10:19.090 --> 01:10:20.380
+I don't see me making that world,
+
+01:10:20.430 --> 01:10:21.240
+but I could see that happen.
+
+01:10:21.420 --> 01:10:22.800
+And then actually, I think it would be okay.
+
+01:10:23.410 --> 01:10:23.900
+How do you feel?
+
+01:10:24.040 --> 01:10:25.260
+Like, do you see that as possible?
+
+01:10:25.540 --> 01:10:26.620
+No, that makes total sense.
+
+01:10:26.870 --> 01:10:29.420
+I think another thing we think about is like,
+
+01:10:29.640 --> 01:10:31.920
+how much are skills becoming more transferable
+
+01:10:32.110 --> 01:10:33.120
+because of LLMs? 
+
+01:10:33.500 --> 01:10:36.040
+So can somebody who already learned Python well
+
+01:10:36.580 --> 01:10:38.960
+now quickly pick up C# in Unity
+
+01:10:39.440 --> 01:10:43.200
+because the LLM is doing a lot of the detailed syntax for them
+
+01:10:43.620 --> 01:10:45.620
+and they just have to understand the programmatic ideas.
+
+01:10:46.180 --> 01:10:48.840
+So it's like, you know, maybe Python will continue
+
+01:10:48.860 --> 01:10:51.520
+to always be this great first language for everybody.
+
+01:10:52.030 --> 01:10:54.220
+And it'll be easier for people to now transition
+
+01:10:54.440 --> 01:10:56.500
+to other languages for specific domains.
+
+01:10:57.140 --> 01:10:59.020
+And so it matters less that we have everything
+
+01:10:59.800 --> 01:11:01.300
+in the original language.
+
+01:11:01.800 --> 01:11:03.840
+Okay, out in the chat,
+
+01:11:04.860 --> 01:11:07.220
+there's the recommendation for, that's Godot.
+
+01:11:07.460 --> 01:11:09.620
+If you want to go, I'm not super familiar with it.
+
+01:11:09.800 --> 01:11:11.720
+I think it's an open source game engine.
+
+01:11:12.260 --> 01:11:13.300
+Yes, that much I know,
+
+01:11:13.380 --> 01:11:14.720
+but that's where my knowledge stops. 
+
+01:11:15.220 --> 01:11:16.600
+Right, and I think it has like,
+
+01:11:16.800 --> 01:11:21.580
+it's like a Godot script or something, or GDScript or something, is its language, which I think
+
+01:11:21.580 --> 01:11:26.720
+has more of a Python-like syntax. Somebody in the chat can correct me. Yeah, interesting. Okay, but I,
+
+01:11:26.880 --> 01:11:31.380
+I think if you took an end-to-end thing, you know, it's not just a matter of, like, you can run the
+
+01:11:31.540 --> 01:11:36.520
+game engine with this. You've got to take it all the way to here's how you ship your games, because
+
+01:11:37.040 --> 01:11:40.920
+soon, what is the very first thing you want to do once you get your game working and fun? You want
+
+01:11:40.920 --> 01:11:45.940
+to show your friends, you know what I mean? And you've got to find a way to send it out. So this
+
+01:11:45.940 --> 01:11:50.220
+goes back to what you were talking about earlier, is all the hurdles around, you know, getting something
+
+01:11:50.220 --> 01:11:56.440
+on the Mac App Store, Steam, or, you know, or whatever. Um, and how, and that goes also to another big thing
+
+01:11:56.490 --> 01:12:01.680
+that we're trying to solve in the Python ecosystem, which is the packaging story, right? Um, which there
+
+01:12:01.680 --> 01:12:06.660
+are many solutions for, but just not one decided on, let's make this the standard, like, you know, as
+
+01:12:06.670 --> 01:12:10.840
+easy as possible thing. Yeah, it's getting better. Things are definitely getting better, but still,
+
+01:12:11.100 --> 01:12:15.800
+there is no Python build --format equals EXE.
+
+01:12:16.060 --> 01:12:16.120
+Right.
+
+01:12:16.320 --> 01:12:20.820
+You know, where, where, what comes out, and then some sort of tooling that automatically
+
+01:12:21.680 --> 01:12:25.040
+signs that stuff with the certificate you've got as a Windows developer.
+
+01:12:25.200 --> 01:12:26.900
+So it doesn't get flagged as malware. 
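[Editor's note] There is no `python build --format exe` today, as discussed above, but the standard library's zipapp module gets partway there. A hedged sketch, assuming only the stdlib: it bundles a package into a single runnable .pyz archive (the target machine still needs a Python interpreter, and none of the signing or notarization steps mentioned above are handled).

```python
# Sketch: the closest stdlib analogue to "build me a single-file app" is
# zipapp, which zips a package with a __main__.py into one runnable file.
import pathlib
import subprocess
import sys
import tempfile
import zipapp

with tempfile.TemporaryDirectory() as tmp:
    src = pathlib.Path(tmp) / "app"
    src.mkdir()
    # __main__.py is the entry point zipapp executes when the archive runs.
    (src / "__main__.py").write_text('print("hello from a bundled app")\n')

    target = pathlib.Path(tmp) / "app.pyz"
    zipapp.create_archive(src, target)

    # The whole "app" now ships as one file: `python app.pyz` just works.
    result = subprocess.run(
        [sys.executable, str(target)], capture_output=True, text=True
    )
    bundled_output = result.stdout.strip()

print(bundled_output)  # hello from a bundled app
```

Third-party tools like PyInstaller go further toward a true platform executable, but the friction around signing and store distribution that the conversation points out remains.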
+
+01:12:27.580 --> 01:12:31.220
+There's just, even if you get the thing to build, there's like these three or four other
+
+01:12:31.480 --> 01:12:33.500
+steps, you know, I'm thinking about iOS.
+
+01:12:33.780 --> 01:12:37.960
+It's like, okay, we got it to run Python, but, but really what we needed to do is like
+
+01:12:37.980 --> 01:12:43.180
+integrate with SwiftUI and have storyboards where I can like, I can weave it. And you're like,
+
+01:12:43.280 --> 01:12:47.720
+well, that's no, that's a long ways away. Like I know, but that's, you got to go through those
+
+01:12:48.080 --> 01:12:52.120
+stages to get it. I don't know. It's just, there's, there's a little bit, a little bit further to go,
+
+01:12:52.200 --> 01:12:55.100
+I guess, but I would love to see it. And I'm, I think it'll happen probably.
+
+01:12:55.360 --> 01:12:57.920
+It's not that you can't do it. It's just, there's too much friction right now.
+
+01:12:58.260 --> 01:13:02.600
+Yeah. Yeah. Yeah. Well, hopefully you've given some people some ideas. 
Somebody's going to go
+
+01:13:02.620 --> 01:13:09.180
+start the Unity, the PyUnity company or whatever, and let's, let's, let's make it happen.
+
+01:13:09.660 --> 01:13:16.080
+Yeah, yeah. Oh, and also just, um, real-time follow-up out there: Godot is apparently the pronunciation.
+
+01:13:16.540 --> 01:13:20.760
+It's French. Sorry. Okay, no, no, I didn't know it either. Um, I think, honestly, just a shout-out to
+
+01:13:20.860 --> 01:13:25.580
+everyone: if you have a weird name for your project on GitHub, put an MP3 and say, this is how you say
+
+01:13:25.580 --> 01:13:31.680
+it. Just get it. Let's, let's help us out. And not that French is weird, but just, you know, we should
+
+01:13:31.700 --> 01:13:36.340
+actually do that with English projects too, right? Um, yeah, yeah, absolutely, to make them accessible to
+
+01:13:36.460 --> 01:13:41.260
+the, yeah, as simple, yeah. It has nothing to do with French. I'm thinking even of something as simple as
+
+01:13:41.700 --> 01:13:46.880
+G-unicorn, which is often said Gunicorn, and half the people out there are probably thinking, Michael,
+
+01:13:47.000 --> 01:13:53.300
+you're wrong, it's Gunicorn, not G-unicorn. But their logo is a green unicorn. Yeah, I'm like, well, it's
+
+01:13:53.460 --> 01:13:57.860
+probably, the G stands just for G, and then certainly the unicorn part is probably unicorn, because
+
+01:13:57.880 --> 01:14:02.880
+unicorn is, you know what I mean? But, like, it's totally reasonable to look at it and go Gunicorn,
+
+01:14:03.320 --> 01:14:08.080
+you know what I mean? But if they just put a little MP3, like, it's, or even just where it's pronounced
+
+01:14:08.460 --> 01:14:14.260
+gee-unicorn, you know what I mean? Yeah, yeah. I think we all need to do that. Yeah, yeah. All right,
+
+01:14:14.400 --> 01:14:19.000
+well, the chat is getting lively, but we're going to have to call it, because we might be just a tiny
+
+01:14:19.040 --> 01:14:23.800
+bit over time, and it's getting late in your part of the world. So, well, thank you so much for
+
+01:14:23.820 --> 
01:14:28.040
+having me on again, Michael. It was really a pleasure. Congratulations again on Talk Python
+
+01:14:28.130 --> 01:14:31.800
+in production. Thank you. And Computer Science from Scratch as well.
+
+01:14:32.240 --> 01:14:35.860
+Yeah. And if folks want to check out the book, there's a website that
+
+01:14:36.440 --> 01:14:39.480
+I'm sure you'll put in the show notes, computersciencefromscratch.com.
+
+01:14:40.160 --> 01:14:44.080
+And you can just learn more about the different projects we do and some of the
+
+01:14:45.620 --> 01:14:48.280
+ideas in the book. Yeah. Awesome. People want to get in touch with you
+
+01:14:48.700 --> 01:14:52.760
+otherwise? How do they do that? Sure. You can find me on X, I guess,
+
+01:14:52.820 --> 01:14:55.400
+at Dave Kopec, D-A-V-E-K-O-P-E-C.
+
+01:14:55.630 --> 01:14:58.460
+And if you go to my website, DaveKopec.com,
+
+01:14:58.580 --> 01:14:59.360
+there's a bunch, there's email
+
+01:14:59.630 --> 01:15:00.960
+and a bunch of other ways to contact me.
+
+01:15:01.130 --> 01:15:03.200
+So D-A-V-E-K-O-P-E-C.com.
+
+01:15:03.460 --> 01:15:04.260
+Yeah, I'll put it in the show notes.
+
+01:15:04.740 --> 01:15:05.140
+Thanks, Michael.
+
+01:15:05.520 --> 01:15:06.680
+Yeah, you bet, you bet.
+
+01:15:06.920 --> 01:15:08.680
+So David, thanks for being back on the show.
+
+01:15:09.020 --> 01:15:09.560
+It's been really fun.
+
+01:15:09.840 --> 01:15:10.440
+Yeah, we'll talk to you later.
+
+01:15:10.760 --> 01:15:11.520
+Awesome, thanks, Michael.
+
+01:15:11.880 --> 01:15:12.140
+See ya.
+
+01:15:13.260 --> 01:15:15.400
+This has been another episode of Talk Python To Me.
+
+01:15:15.660 --> 01:15:16.480
+Thank you to our sponsors.
+
+01:15:16.750 --> 01:15:18.020
+Be sure to check out what they're offering.
+
+01:15:18.230 --> 01:15:19.560
+It really helps support the show.
+
+01:15:20.320 --> 01:15:21.980
+This episode is brought to you by Sentry. 
+
+01:15:22.240 --> 01:15:23.560
+Don't let those errors go unnoticed.
+
+01:15:23.710 --> 01:15:25.340
+Use Sentry like we do here at Talk Python.
+
+01:15:25.880 --> 01:15:28.700
+Sign up at talkpython.fm/sentry.
+
+01:15:29.600 --> 01:15:31.580
+And it's brought to you by NordStellar.
+
+01:15:32.100 --> 01:15:34.620
+NordStellar is a threat exposure management platform
+
+01:15:35.200 --> 01:15:36.440
+from the Nord Security family,
+
+01:15:36.730 --> 01:15:37.880
+the folks behind NordVPN,
+
+01:15:38.460 --> 01:15:40.040
+that combines dark web intelligence,
+
+01:15:40.680 --> 01:15:42.200
+session hijacking prevention,
+
+01:15:42.800 --> 01:15:44.800
+brand and domain abuse detection,
+
+01:15:45.180 --> 01:15:46.880
+and external attack surface management.
+
+01:15:47.480 --> 01:15:49.600
+Learn more and get started keeping your team safe
+
+01:15:49.620 --> 01:15:52.400
+at talkpython.fm/nordstellar.
+
+01:15:53.240 --> 01:15:55.080
+If you or your team needs to learn Python,
+
+01:15:55.270 --> 01:15:58.700
+we have over 270 hours of beginner and advanced courses
+
+01:15:58.960 --> 01:16:01.160
+on topics ranging from complete beginners
+
+01:16:01.470 --> 01:16:05.340
+to async code, Flask, Django, HTML, and even LLMs.
+
+01:16:05.560 --> 01:16:08.020
+Best of all, there's no subscription in sight.
+
+01:16:08.400 --> 01:16:10.180
+Browse the catalog at talkpython.fm.
+
+01:16:10.920 --> 01:16:12.820
+And if you're not already subscribed to the show
+
+01:16:13.040 --> 01:16:14.240
+on your favorite podcast player,
+
+01:16:14.870 --> 01:16:15.540
+what are you waiting for?
+
+01:16:16.160 --> 01:16:17.960
+Just search for Python in your podcast player.
+
+01:16:18.120 --> 01:16:18.940
+We should be right at the top.
+
+01:16:19.360 --> 01:16:22.140
+If you enjoyed that geeky rap song, you can download the full track.
+
+01:16:22.360 --> 01:16:24.240
+The link is actually in your podcast player show notes. 
+
+01:16:25.080 --> 01:16:26.420
+This is your host, Michael Kennedy.
+
+01:16:26.820 --> 01:16:27.880
+Thank you so much for listening.
+
+01:16:28.080 --> 01:16:28.880
+I really appreciate it.
+
+01:16:29.320 --> 01:16:29.980
+I'll see you next time.
+
+01:16:42.840 --> 01:16:43.980
+And we ready to roll
+
+01:16:45.560 --> 01:16:46.760
+Upgrading the code
+
+01:16:47.480 --> 01:16:49.260
+No fear of getting old
+
+01:16:50.660 --> 01:16:52.760
+We tapped into that modern vibe
+
+01:16:53.200 --> 01:16:54.140
+Overcame each storm
+
+01:16:54.920 --> 01:16:56.260
+Talk Python To Me
+
+01:16:56.340 --> 01:16:57.640
+Async is the norm
+
diff --git a/transcripts/530-anywidget.txt b/transcripts/530-anywidget.txt
new file mode 100644
index 0000000..07fdd1a
--- /dev/null
+++ b/transcripts/530-anywidget.txt
@@ -0,0 +1,2152 @@
+00:00:00 For years, building interactive widgets in Python notebooks meant wrestling with toolchains, platform quirks, and a mountain of JavaScript machinery.
+
+00:00:07 Most developers took one look and backed away slowly.
+
+00:00:10 Trevor Manz decided that barrier did not need to exist.
+
+00:00:13 His idea was simple.
+
+00:00:15 Give Python users just enough JavaScript to unlock the web's interactivity without dragging along the rest of the web ecosystem.
+
+00:00:22 That idea became anywidget, and it's quickly becoming the quiet, connective tissue of modern interactive computing.
+
+00:00:29 Today, we dig into how it works, why it's taken off, and how it might change the way we explore data.
+
+00:00:35 This is Talk Python To Me, episode 530, recorded November 25th, 2025.
+
+00:00:57 Welcome to Talk Python To Me, the number one Python podcast for developers and data scientists.
+
+00:01:03 This is your host, Michael Kennedy.
+
+00:01:04 I'm a PSF fellow who's been coding for over 25 years.
+
+00:01:08 Let's connect on social media.
+
+00:01:10 You'll find me and Talk Python on Mastodon, Bluesky, and X. 
+
+00:01:13 The social links are all in your show notes.
+
+00:01:16 You can find over 10 years of past episodes at talkpython.fm.
+
+00:01:20 And if you want to be part of the show, you can join our recording live streams.
+
+00:01:23 That's right.
+
+00:01:24 We live stream the raw uncut version of each episode on YouTube.
+
+00:01:27 Just visit talkpython.fm/youtube to see the schedule of upcoming events.
+
+00:01:32 Be sure to subscribe there and press the bell so you'll get notified anytime we're recording.
+
+00:01:37 Look into the future and see bugs before they make it to production.
+
+00:01:40 Sentry's Seer AI Code Review uses historical error and performance information at Sentry
+
+00:01:46 to find and flag bugs in your PRs before you even start to review them.
+
+00:01:51 Stop bugs before they enter your code base.
+
+00:01:53 Get started at talkpython.fm/seer-code-review.
+
+00:01:57 And this episode is sponsored by JetBrains and the PyCharm team.
+
+00:02:01 This week only through December 19th, 2025, get 30% off of PyCharm Pro, including renewals,
+
+00:02:08 and PyCharm will donate all the proceeds to the Python Software Foundation.
+
+00:02:12 Support the PSF by getting or renewing PyCharm.
+
+00:02:15 Visit talkpython.fm/pycharm dash PSF dash 2025 and use the code STRONGERPYTHON.
+
+00:02:22 Both of these are in your podcast player show notes.
+
+00:02:25 Trevor, welcome to Talk Python To Me. Awesome to have you here. Thanks for coming.
+
+00:02:29 Yeah, thanks for having me, Michael.
+
+00:02:30 We're going to talk about building Lego blocks for data scientists. Is that maybe a good way to put it?
+
+00:02:36 Yeah, definitely.
+
+00:02:37 Yeah, so we're going to talk about your project, AnyWidget, which is a super cool project that allows people to build more adaptable, more reusable widgets,
+
+00:02:48 rather than just saying I'm going to build it for Jupyter or Marimo or whatever. 
+ +00:02:52 but I'm going to build something that can be used broadly through your platform and a lot simpler + +00:02:57 deployment as well. So I know a lot of people out there are excited. Not everyone necessarily wants + +00:03:02 to build widgets, but this also makes more widgets available for people, right? So super cool project. + +00:03:07 Yeah. And before we get into that, of course, maybe just tell us a bit about yourself. + +00:03:12 Yeah. So I'm Trevor. I'm a software engineer at Marimo, but prior to being at Marimo, I was + +00:03:18 doing my PhD in data visualization at Harvard Medical School. So I got into this whole like + +00:03:25 Python web world from working on tools for biological scientists to explore like large + +00:03:32 genomics or spatial genomic data sets. That is a super interesting topic. I'm sure + +00:03:38 there were some really niche things you had to build, right? Like, you know, we have this crazy + +00:03:43 research machine. There's three of them in the world and here's the data that comes out of it. + +00:03:48 how do I put that into notebooks or whatever, right? + +00:03:51 Yeah, yeah, exactly. + +00:03:53 And it's funny you mentioned that. + +00:03:55 I think starting off in more like the visualization side, + +00:03:57 we'd build these very bespoke and custom applications for looking at large biological data sets. + +00:04:03 And then I think a motivation for me was realizing a lot of the folks that I was working with + +00:04:07 to build these tools for worked in Python and worked in notebooks specifically. + +00:04:11 And so there was sort of this gap between the research that we were doing to build these specialized tools + +00:04:16 and trying to meet users where they were at to actually use those tools in their research + +00:04:20 and as a part of their data exploration workflows. + +00:04:23 - And this was your PhD work? + +00:04:24 - Yes, yeah, this was my PhD work. + +00:04:26 - Incredible. 
+
+00:04:27 So I'm sure it wasn't just Python.
+
+00:04:30 - No, so--
+
+00:04:31 - What was involved with it?
+
+00:04:32 - So yeah, I'd say the research community, at least in bio, is sort of split between R and Python,
+
+00:04:39 but specifically some of our closest collaborators were in the Python ecosystem.
+
+00:04:43 And then my team was building a lot of user interfaces for visual exploration of data sets.
+
+00:04:49 And I think this notebook sort of became this space and specifically Python notebooks
+
+00:04:54 where you can really blend those programmatic and user interfaces together.
+
+00:04:59 That's sort of why I centered on trying to solve some of the rough edges
+
+00:05:03 about bringing these interactive tools inside of notebooks.
+
+00:05:07 - That's cool.
+
+00:05:08 Did you do any other platforms?
+
+00:05:11 I'm thinking like Unity or Unreal, Unreal Engine, for like flying through data, or what else did you all do?
+
+00:05:20 Yeah, there were some folks on my team that ended up or are currently working in more like spatial,
+
+00:05:25 like AR, VR type scenarios.
+
+00:05:27 And that I think is based on the Unity engine.
+
+00:05:30 Most of my work was like web-based and we, and specifically, and we'll probably get into it.
+
+00:05:34 I think the web is just a really important platform to develop for in the sense that,
+
+00:05:39 I mean, we're recording right now in a web browser.
+
+00:05:42 It's a very capable platform that everyone sort of has an entry point to on some sort of device.
+
+00:05:47 And so developing for the web often means that you can reuse a lot of that application amongst different devices and in different contexts.
+
+00:05:56 It's unbelievable what the web can do these days.
+
+00:05:58 I made the comment before we started, like, I can't believe this is a web browser that we're building, that we're doing this live streaming, video sharing, local recording.
+
+00:06:08 All this stuff is just like, it's so wild. 
+ +00:06:11 And I built an app recently that I'm using to kind of help out with some of the interview + +00:06:14 stuff. + +00:06:15 And it's on the web. + +00:06:15 It just runs on my iPad. + +00:06:17 I mean, it goes underappreciated some of the time, I think, of just how far you can push + +00:06:22 it. + +00:06:22 It's not just documents and Ajax. + +00:06:25 Yeah. + +00:06:25 Yeah. + +00:06:26 The fact that a version of Photoshop runs in the browser is pretty amazing. + +00:06:30 And Figma itself is all running in the web browser. + +00:06:33 So it's definitely a capable platform. + +00:06:35 But one thing it's not very good at is, or doesn't have the ecosystem for, is a lot of + +00:06:40 data work. + +00:06:40 And I think that that's really the big divide. + +00:06:42 So I see the web is really like one of the most accessible platforms + +00:06:46 for building like user interfaces and like applications. + +00:06:50 But then there's this whole other world, which is like tools that you actually do data science stuff in. + +00:06:54 And that is not the web whatsoever, but has like a similar amount of maturity + +00:06:58 in terms of being very ergonomic and good at doing that type of work. + +00:07:02 - Right, and those two worlds meet in, they meet in the web, right? + +00:07:08 But the people doing the Python work, they aren't necessarily web developers. + +00:07:13 You gave a talk a year ago about anywidget that I'll definitely link to. + +00:07:18 And you have this graphic here like, okay, we've got the front end, which often is Jupyter. + +00:07:25 Not always, as we'll see in a minute. + +00:07:27 But often is Jupyter with the way you build for it is with all these web front end tools + +00:07:32 that even me doing web development is a lot of times I'm like, what is this? + +00:07:36 And why is it so complicated? + +00:07:38 I just don't even know. + +00:07:40 And I do web stuff all the time. 
+
+00:07:42 And then you've got the backend stuff where Python people live doing NumPy,
+
+00:07:46 Polars, Matplotlib, et cetera.
+
+00:07:48 Do you want to riff on that challenge a little bit?
+
+00:07:50 Because I know your project is really here to kind of solve some of this disjointedness.
+
+00:07:54 Yeah, I think one interesting thing, like taking a step back,
+
+00:07:58 and I mentioned this in the talk that I gave a while back,
+
+00:08:01 but like the web and Python ecosystems have been around for almost the same exact amount of time.
+
+00:08:06 Like I think like the first webpage was around the same year that the first release of Python came out.
+
+00:08:10 And so they sort of have been evolving in their own respective,
+
+00:08:15 or developing their own respective ecosystems for that period of time.
+
+00:08:18 And then as the web became really this platform for building user interfaces,
+
+00:08:22 I think that some pretty visionary folks saw that you could connect
+
+00:08:28 these backend resources to these very interactive applications
+
+00:08:31 that could be developed very quickly.
+
+00:08:34 So over the years, these things that started off very far apart
+
+00:08:36 have now come very close together.
+
+00:08:38 And now with WebAssembly, sometimes it's just all the, like turtles all the way down,
+
+00:08:43 it's all actually running in the browser.
+
+00:08:44 So it's pretty amazing.
+
+00:08:46 But I think that there definitely is this friction.
+
+00:08:49 And one thing I've also noticed from being in both these,
+
+00:08:51 the web and Python ecosystems for so long is sort of these like ebbs and flows of like maturity
+
+00:08:56 or like places that need to catch up to one another.
+
+00:08:59 And so I came from definitely like the web side originally. 
+
+00:09:03 And there's a lot that I took for granted, I guess, and that ecosystem of tooling that for a while, I think,
+
+00:09:09 gave the language a very bad rap because it was sort of the scripting language that was invented,
+
+00:09:13 I think, famously in two weeks.
+
+00:09:16 And now people are trying to build applications in it.
+
+00:09:18 But over the years, there have been standards and things have matured.
+
+00:09:23 And actually, the web is very backwards compatible and has some of these properties that are kind of amazing
+
+00:09:28 for something that's been around for that long.
+
+00:09:30 And then I'd say in the last couple of years, we've seen a bunch of new developer tool work going into the Python ecosystem that in some part
+
+00:09:39 is also inspired by some of the tooling in the web ecosystem. So I love to see how these things
+
+00:09:44 go back and forth. And I think if you haven't checked in on the ecosystem for a while,
+
+00:09:49 you might think it isn't what it was five years ago as it is today. And one of the goals of AnyWidget
+
+00:09:55 is to try to demystify some of maybe those prior bad experiences folks had developing and just
+
+00:10:01 you know, get back to like opening up some HTML and writing some JavaScript and getting your Python
+
+00:10:07 data in the browser. Yeah, absolutely. You know, I wonder, as you describe it being made in two
+
+00:10:13 weeks, which I also think this is true, when it was made, I think, at Mozilla, right? Yeah. Anyway,
+
+00:10:21 one of JavaScript's real issues is it doesn't have a real numerical system, right? Everything is
+
+00:10:28 a float. That has a lot of problems for data science when it really, really has to be integers
+
+00:10:33 or whatever, right? Like you need more control over the numbers. I wonder how much of Python's
+
+00:10:39 benefit in the data science space originates from people going, JavaScript's super popular,
+
+00:10:43 but can't do numbers there. 
I do science, can't do JavaScript. What am I doing now? You know what
+
+00:10:48 I mean? What do you think about that? It just occurred to me. Yeah, it's interesting. Like that
+
+00:10:52 is originally like a constraint of the language. And maybe, you know, when you're picking ecosystems
+
+00:10:58 on. Also, there was no server way to run JavaScript until 2010. So the idea of just running this
+
+00:11:04 without a browser was also not very good. So yeah, I'm not quite sure, but it's funny because
+
+00:11:11 JavaScript has sort of reacted to that by adding BigInts as a data type. But also now there's
+
+00:11:17 sort of like a fourth language for the web, which is WebAssembly, and there are proper data types
+
+00:11:22 in WebAssembly. So I guess it is sort of reactionary, but I haven't really seen. So now you can run
+
+00:11:28 Python in the browser. And, but you'd have to have all this like ecosystem grift or migration that I
+
+00:11:33 just don't think anybody wants. So it's more, maybe these ecosystems can play a lot nicer together.
+
+00:11:39 Yeah. Yeah. And certainly they are with things like Jupyter and stuff, right? Like most users
+
+00:11:44 of Jupyter don't write JavaScript. Yes. And I think that everyone's happy about that. So
+
+00:11:49 exactly. That's true. There's a lot of people who are like, you know what, that's not for me.
+
+00:11:52 So, you know, you were at Harvard doing your work and now you're at Marimo?
+
+00:11:58 Marimo?
+
+00:11:59 I always mess up saying this.
+
+00:12:01 Tell us how to do it.
+
+00:12:02 It's Marimo.
+
+00:12:03 Marimo.
+
+00:12:04 Okay, Marimo.
+
+00:12:05 Yeah.
+
+00:12:05 And I had Akshay on, and I know we talked a lot about it and he set me straight and then
+
+00:12:10 I drifted, I'm sure.
+
+00:12:11 So what's your experiences here?
+
+00:12:14 Like, how's it coming from academia to this world?
+
+00:12:17 It's been great. 
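[Editor's note] The "everything is a float" point above can be made concrete from Python's side. JavaScript's Number is an IEEE-754 double, and Python's float has the same 53-bit integer limit; Python's int, however, is arbitrary precision, which is what numeric code relies on.

```python
# 2**53 is the largest integer range an IEEE-754 double represents exactly.
big = 2**53

# As exact Python ints, big and big + 1 are distinct values.
print(big + 1 - big)                  # 1

# Squeezed through a 64-bit float (what a JS Number always is), they collide.
print(float(big + 1) == float(big))   # True
```

This is the kind of silent precision loss that makes a float-only numeric model awkward for data science.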
+
+00:12:18 I think that having worked with, so part of AnyWidget was trying to address some user needs working inside of Notebooks.
+
+00:12:25 And there's only so much that you can do as a plugin to get that extra bit to help users.
+
+00:12:32 And so I think one really exciting thing about working on this team with Marimo is they got very involved,
+
+00:12:37 or they were aware of AnyWidget a while ago, and they kind of actually legitimized AnyWidget as,
+
+00:12:43 hey, this is a standard because we were able to implement around this specification.
+
+00:12:48 And so that got us together.
+
+00:12:50 And then from there, it's just been super exciting to work with a team that is so passionate about notebooks and thinking about this next generation of notebooks and what that environment needs to look like.
+
+00:13:00 And so there's much more that we can do inside of Marimo beyond the AnyWidget specification, but also AnyWidget is an important component of that.
+
+00:13:07 So I like to be able to sort of think about both of these two worlds.
+
+00:13:11 That's cool.
+
+00:13:11 So when I use widgets in Marimo, it's often an AnyWidget?
+
+00:13:17 Yep. Yeah, exactly. So when folks usually come and they want to like extend Marimo with their own custom UI elements, like our answer is you should make an AnyWidget. So that's pretty exciting. But then if you're using things in like the sidebar or things that like sort of integrate outside of like the notebook view, those are more custom elements that we've created at Marimo that aren't necessarily based on AnyWidget.
+
+00:13:42 This portion of Talk Python To Me is brought to you by Sentry.
+
+00:13:46 Let me ask you a question.
+
+00:13:48 What if you could see into the future?
+
+00:13:50 We're talking about Sentry, of course, so that means seeing potential errors, crashes, and bugs before they happen,
+
+00:13:56 before you even accept them into your code base. 
+
+00:13:59 That's what Sentry's AI Seer Code Review offers.
+
+00:14:03 You get error prediction based on real production history.
+
+00:14:06 AI Seer Code Review flags the most impactful errors your PR is likely to introduce before merge using your app's error and performance context,
+
+00:14:16 not just generic LLM pattern matching.
+
+00:14:19 Seer will then jump in on new PRs with feedback and warnings if it finds any potential issues.
+
+00:14:25 Here's a real example.
+
+00:14:26 On a new PR related to a search feature in a web app,
+
+00:14:29 we see a comment from Seer by Sentry bot in the PR.
+
+00:14:34 And it says, potential bug, the process search results function
+
+00:14:38 can enter an infinite recursion when a search query finds no matches,
+
+00:14:43 as the recursive call lacks a return statement and a proper termination condition.
+
+00:14:48 And Seer AI Code Review also provides additional details
+
+00:14:51 which you can expand for further information on the issue and suggested fixes.
+
+00:14:56 And bam, just like that, Seer AI Code Review has stopped a bug in its tracks
+
+00:15:00 without any devs in the loop.
+
+00:15:02 A nasty infinite loop bug never made it into production.
+
+00:15:06 Here's how you set it up.
+
+00:15:07 You enable the GitHub Sentry integration on your Sentry account.
+
+00:15:11 Enable Seer AI on your Sentry account.
+
+00:15:13 And on GitHub, you install the Seer by Sentry app and connect it to your repositories that you want it to validate.
+
+00:15:19 So jump over to Sentry and set up code review for yourself.
+
+00:15:23 Just visit talkpython.fm/seer-code-review.
+
+00:15:27 The link is in your podcast player show notes and on the episode page.
+
+00:15:30 Thank you to Sentry for supporting Talk Python To Me.
+
+00:15:34 Why Marimo?
+
+00:15:35 I know I had Akshay on, people talked about it. 
+
+00:15:37 When I look at this, it feels like a super modern UI that is just,
+
+00:15:41 it's got a lot of polish and a lot of, it feels 2025 and working with it, right?
+
+00:15:48 And it solves some of the, I consider Jupyter Notebooks like the world's most insane,
+
+00:15:54 not the world's most insane, the second most insane series of go-to statements
+
+00:15:58 that have no record of how you got the go-to, you know, it doesn't even say go to 10.
+
+00:16:02 It's like, well, what did you go and run?
+
+00:16:04 I don't know, like, look at the numbers, but there's like potentially, you know,
+
+00:16:08 some lost as you rerun them.
+
+00:16:10 And it solves that problem as well.
+
+00:16:11 But like, you know, what's the elevator pitch for Marimo?
+
+00:16:13 Yeah, my elevator pitch for Marimo is that I think notebooks are incredibly important
+
+00:16:19 and that it's hard to deny that like they're a tool that many folks use to do their daily work.
+
+00:16:25 But there are very few like guardrails to help you do that work and work on a team with other folks.
+
+00:16:30 And so I think one of the big selling points to me with Marimo
+
+00:16:33 is just that you sort of get to free yourself from thinking about those go-to statements,
+
+00:16:38 and instead you have this very deterministic execution that sort of just feels natural.
+
+00:16:44 As you write cells, they will re-execute.
+
+00:16:47 And then on top of that, I think that Jupyter and Marimo have slightly different goals.
+
+00:16:51 So Jupyter, when I showed that diagram, we had that diagram before,
+
+00:16:54 you have this split between this diversity of front ends.
+
+00:16:58 So that's like you could use it in Colab, you could use a Jupyter kernel inside of VS Code,
+
+00:17:02 you could use a Jupyter kernel sort of within the Jupyter CLI.
+
+00:17:06 But then there's a whole other requirement of that ecosystem,
+
+00:17:08 which is to support many different kernels. 
+

+00:17:10 So it's not just Python, it could be an R kernel, it could be a Julia kernel.

+00:17:14 And so I think ultimately a tension that ends up happening

+00:17:17 is like a priority of that ecosystem is to support all these languages

+00:17:20 and all these different front ends.

+00:17:21 And instead, by just focusing on Python, we can really try to develop a really integrated

+00:17:27 and rich experience specifically for Python notebooks.

+00:17:30 So the trade-off there being some folks come and they say,

+00:17:33 oh, we'd love Marimo, but for R.

+00:17:34 And we're like, well, we can't really do that.

+00:17:37 It's called JupyterLab.

+00:17:38 Yeah, but we are hyper-focused on Python and that lets us integrate with other trends within the ecosystem too.

+00:17:44 So we're a huge fan of all the work from the Astral folks.

+00:17:47 And we have a tight integration with uv, for example, to allow you to sort of use that PEP 723 script metadata.

+00:17:54 And as you're working, we'll like install packages for you

+00:17:57 and write that metadata so that at the end, you can send that document to someone else

+00:18:02 and they can like bootstrap their notebook and get all their dependencies

+00:18:04 and have their environment set up.

+00:18:06 And I think that type of like last mile with these notebooks

+00:18:10 is something that's been hard to get at without really focusing on like an ecosystem.

+00:18:14 That last mile though, that's a lot of polish.

+00:18:16 And that's often what kind of gets lost in projects where, I don't know, it's not exactly the focus, right?

+00:18:25 It's something where you want to take on the most interesting features

+00:18:28 and it's not like it's necessarily a company and there's somebody who's like,

+00:18:32 no, we're polishing every little rough edge, period.

+00:18:35 You know what I mean?

+00:18:36 And I think you mentioned Astral and uv.
+

+00:18:38 I think that is an example of they got funding and it's like, okay, we're not going to do 95% of the packaging.

+00:18:45 We're going to do 99.9% of the packaging.

+00:18:47 You know what I mean?

+00:18:47 And it's made a tremendous difference.

+00:18:50 Yeah.

+00:18:50 Yeah, I mean, I have plenty to say about Astral.

+00:18:53 We love all their tooling over here too.

+00:18:56 Yeah, so do I.

+00:18:56 I switched all my stuff to Astral, to uv and Ruff, and yeah, super, super neat.

+00:19:02 Okay, so another thing that you all just released is a new VS Code plugin.

+00:19:08 So I guess one of the aspects of Marimo that is not necessarily apparent from looking at

+00:19:13 it going there is that it's backed by a Python file, not a JSON file that contains both the

+00:19:18 inputs and outputs, which is like one of the big shortcomings of Jupyter. Like how much of Jupyter

+00:19:24 would have benefited if it just had an input and an output file? So you don't check in the output

+00:19:29 file potentially. You know what I mean? Yeah. Yeah. We, I think that was pretty visionary or maybe it

+00:19:34 was just, it's more that we have this ability to observe what's happened in the Jupyter ecosystem

+00:19:40 and make adjustments from there. A little bit of a second mover benefit of like, we saw that,

+00:19:44 that was 80% good.

+00:19:45 Definitely.

+00:19:46 Yeah, it was surprising.

+00:19:48 I mean, I think to rip more on Marimo, just there are all these,

+00:19:52 when people ask that question of like why Marimo, I think there are all these like very small things,

+00:19:55 but like that in net, I think are certainly like an order of magnitude kind of like richer experience

+00:20:02 for our users.
+

+00:20:04 But it's funny how many times just the fact that like our notebooks are git diffable,

+00:20:08 that's like one of the first things that folks like really latch onto

+00:20:11 and love about Marimo notebooks.

+00:20:13 Yeah, yeah, very cool. And so the VS Code extension, what's the, like, how's this, how's this help us?

+00:20:19 Yeah, so I see this as, I think the best experience with using Marimo will always be sort of our CLI or like our, our web interface that we've like really curated around this experience.

+00:20:31 But we have a broad range of users that want to collaborate or work on Marimo documents.

+00:20:36 And so some of them are like, you know, I live in my VS Code, my Cursor, my Windsurf, my IDE, and like I never want to leave this environment.

+00:20:44 And so getting them to come and like collaborate on these documents has been a little bit of a barrier there.

+00:20:49 So they could, of course, edit a Python file on disk, but having like a native notebook view like they do for Jupyter was something that we sort of lacked in those environments.

+00:20:57 And so we made it a priority shortly after I joined the team where we really wanted to address that user base of folks that want to live in that editor environment.

+00:21:06 And sort of from the ground up wrote this new extension where we have a very integrated view with sort of reactive outputs, similarly to what we have in our user interface.

+00:21:16 But it sort of lives inside of the VS Code ecosystem and uses the VS Code APIs.

+00:21:22 Yeah, that makes a lot of sense.

+00:21:24 And it's available on the, what is it?

+00:21:25 Open VSX marketplace for like Cursor and Antigravity.

+00:21:30 That's a thing as of like three days ago.

+00:21:32 Yes, all these different skins of VS Code, our extension should work as well.

+00:21:38 And we publish it to both marketplaces.

+00:21:40 So folks should be able to install it and get started.
+

+00:21:42 It's a weird time for editors.

+00:21:44 Yes.

+00:21:45 It was really kind of strange.

+00:21:47 When I first started Talk Python 11 years ago or whatever it was,

+00:21:50 I would ask people what editor they use at the end of the show.

+00:21:53 I didn't know what they were going to say. It could have been anything. And then it just narrowed

+00:21:58 down to VS Code, PyCharm. Those are the answers. And now it's kind of back to a lot of options

+00:22:05 again. But it generally works in the VS Code variants as well. All right. Super cool. Okay.

+00:22:12 Let's talk about just what is a Jupyter widget? Yeah. So like I mentioned before, Jupyter really

+00:22:18 reaches, and Marimo also, across this chasm between your Python code that's running in some,

+00:22:23 we call it like a kernel environment that's like a stateful environment that has your variables and

+00:22:26 when you run cells, things update there. And then on the front end, there's the cell outputs

+00:22:31 that are a reflection of whatever is sort of living in the kernel. And a Jupyter widget

+00:22:36 is a very interesting part of that crossing of that chasm, in that

+00:22:42 you can actually author both the front end code and the back end code to talk to one another. So

+00:22:46 typically, if you write a Python library, you're only writing something that runs

+00:22:50 on the server. And then if you're making a Jupyter extension, for example, that's something that only

+00:22:54 lives in the front end. But a widget allows a library author to connect those two worlds together.

+00:22:59 And so for your small object that lives in the Python kernel, you can define ways that it can

+00:23:05 render itself and also be updated from that front end.
And so it's a very powerful mechanism that's

+00:23:11 officially a part of Jupyter that allows you to extend your output cells with, I like to think of

+00:23:17 it as these web-based superpowers to be able to update your code in the kernel.

+00:23:21 You have a really interesting point that you made in your SciPy talk, saying basically,

+00:23:27 we can do lots of UI stuff in notebooks, but often those are kind of like isolated snapshots,

+00:23:35 right? Maybe you have a graph and there's like outliers or something along those lines,

+00:23:40 but you can see them, but you can't like programmatically interact with them in the

+00:23:45 way that you could a widget, whereas you might drag and select to get a zoom area that actually

+00:23:50 becomes a data frame, or you could move some widgets that then redraw the picture, but also

+00:23:56 becomes something you could program against in like a machine learning algorithm or whatever,

+00:24:00 right? I think that's a pretty powerful distinction that people want to have in mind as

+00:24:05 we dive into anywidget. Yeah, definitely. I like to think of it as like in a traditional

+00:24:10 REPL kind of environment, your two modes of input are sort of writing code and running cells.

+00:24:15 So you can write the code and you can run the cell and then you can observe the output. But in order

+00:24:19 to act on something that you see in that output, you have to write more code. So if you

+00:24:24 have a plot, for example, that you produce from one of those outputs, if there's some outliers or

+00:24:29 something, a very natural thing to think is, what is that point? And you want to circle that point.

+00:24:33 And that is much easier to do with a gesture than it is to express as some sort of

+00:24:38 query, typing out some program and kind of looking at the axes to say, what are those

+00:24:44 points there.
And so I think widgets really allow for you to be creative in terms of how you

+00:24:51 might add another mode of input for adjusting things that might be in the kernel. And like

+00:24:56 you're saying, now you can transform a selection into just a data frame that you get out,

+00:25:00 and then you're off to the races with running the next part of your program. And so I think

+00:25:04 it adds this degree of freedom to let the algorithms and data use code when that's

+00:25:09 useful. And then for those pieces where you want to step in as a human and maybe use a gesture

+00:25:14 and that's easier to express as a gesture, then we have that mechanism now

+00:25:18 to allow you to extend your analyses.

+00:25:21 Right.

+00:25:21 If you don't need pictures and interaction, just do it in Python straight.

+00:25:26 You know what I mean?

+00:25:27 You don't need notebooks.

+00:25:28 And part of the value of notebooks is this mix of like a visual plus code plus visual

+00:25:35 and this sort of iterative back and forth between those.

+00:25:38 And widgets reconnect the visual back to the code, I guess, right?

+00:25:42 Yeah, exactly.

+00:25:43 I like to think of it as the types of workflows in notebooks

+00:25:47 are not typically batch scripts.

+00:25:49 I mean, in Marimo, you can write your notebook and run it as a script because it's just Python.

+00:25:54 But often when people are working and really getting their hands on data,

+00:25:57 these are long-lived sessions where you have a lot of data in memory

+00:26:01 and you really want to get your hands on that data.

+00:26:03 And I think if you only had the ability to write and run cells and not add that extra dimension,

+00:26:10 you'd be pretty limited in what you could actually do.
+

+00:26:12 And so something like widgets allows you to really extend and be creative with how you'd like to explore that data and sort of like debug your data as it's in memory and get your hands on it.

+00:26:22 Yeah.

+00:26:23 Yeah.

+00:26:23 I mean, that's what blew the world open when all that came around, right?

+00:26:27 Yeah.

+00:26:27 So that brings us to anywidget.

+00:26:30 Why a new widget framework?

+00:26:32 Why not just ipywidgets or whatever?

+00:26:36 Yeah, it's a great question.

+00:26:37 And it's something that when I started my PhD, like I was not, I guess I didn't really have a hand in either.

+00:26:42 But AnyWidget was definitely born out of a need of my own to solve a problem that I was running into during the PhD.

+00:26:49 So as I mentioned before, we had all these visualization tools that we were building.

+00:26:54 And specifically, one of the tools I was building was like a web-based genome browser.

+00:26:59 And so this is something that people put their genomes into and you can align different tracks and you can zoom in and try to understand different

+00:27:07 types of functional assays in genomics.

+00:27:10 And then we had users that wanted to use these inside of computational notebooks.

+00:27:14 And one thing that immediately happened when we tried to build these integrations for notebooks

+00:27:19 is that you'd spend all this engineering effort trying to build an integration for a specific notebook.

+00:27:24 But then someone would come in and say, oh, I'd actually want this to work in Google Colab,

+00:27:28 or I need this to work in VS Code, or I need this to work in this other environment.

+00:27:34 And all of those environments are very similar on the back end

+00:27:37 but they're very different on the front end in terms of the way that you can register like front end code.
+

+00:27:41 And so it ended up putting a lot on a library author like myself

+00:27:45 to try to build those adapters for each of those environments.

+00:27:48 And so anywidget was sort of an attempt to standardize maybe how we can write that front end code

+00:27:54 such that we can build those adapters in one place

+00:27:57 so that we can support all those different environments.

+00:27:59 Right, right.

+00:28:00 We talked about building the widgets largely in JavaScript earlier already,

+00:28:03 but I think what people who maybe haven't tried this don't realize is how many variations you would have to build to say, hey, I wanted it to

+00:28:13 work in Jupyter and I wanted it to work in Colab, and you've got to build and publish to PyPI and npm

+00:28:19 and so on, right? So everyone who built one of those and wanted to cross use it,

+00:28:25 they had to do all that adaptation. So your thought was basically, well,

+00:28:30 I could create an adapter that everyone could use and then we don't have to do this ever again.

+00:28:35 Yeah, exactly. And to what you're saying as well, I think that adapter craziness that's in the web ecosystem of like, oh my gosh, I had to transpile my code into all these different formats.

+00:28:49 It's kind of a nightmare in terms of tooling. And it's something that came as a result of the state of the JavaScript ecosystem being pretty immature before 2018.

+00:29:01 And basically because JavaScript didn't have a module system.

+00:29:04 And so everyone had to come up with a way to bundle code.

+00:29:07 And there are all these third-party adapters.

+00:29:08 They're figuring out a way that they could load JavaScript modules.

+00:29:12 And so whatever these tools chose and the way that they chose to load their front-end code

+00:29:17 would affect the way that I, as the library author, would need to package up the code.
+

+00:29:21 By modules and loading, you're talking where people might see the word require at the top

+00:29:26 or something like that.

+00:29:27 Kind of like our import.

+00:29:29 Yes, exactly.

+00:29:29 It's very much like import statements in Python.

+00:29:32 And in fact, if you open up your browser, you can use a syntax that's called ES modules

+00:29:39 and type in import.

+00:29:41 And you can import a bit of JavaScript code via URL.

+00:29:45 And this just works natively in the browsers like today, but wasn't sort of standardized until late 2015, or late 2018.

+00:29:54 So you mentioned require, that's actually a different module system

+00:29:58 that's not based in browsers.

+00:30:00 And if you were to type require in your browser, that would not work.

+00:30:03 And that is sort of the tension that was in the JavaScript ecosystem.

+00:30:06 So I was aware of some of the trends and things that have happened

+00:30:09 post this new module system because I've been working in the web ecosystem.

+00:30:13 And yet when I came to work on like Jupyter widgets specifically,

+00:30:15 there was like, I felt like I was writing code

+00:30:18 like I had been like a while ago for the browser.

+00:30:20 And so the idea behind anywidget was to like, let's just simplify this so like that my source

+00:30:25 and the code that I publish is more like what the browser understands natively.

+00:30:29 And then we take care of this like transpilation stuff once for everybody and package that up

+00:30:34 inside of anywidget to deal with sort of like the legacy module systems of like JupyterLab,

+00:30:39 VS Code, and Google Colab.

+00:30:42 This portion of Talk Python To Me is brought to you by JetBrains and the PyCharm team.

+00:30:47 The PSF, the Python Software Foundation, is a nonprofit organization behind Python.
+

+00:30:53 They run or oversee the conferences, handle all the legal business to do with Python,

+00:30:58 oversee the steering council and other groups, and much more.

+00:31:01 But let's focus in on one key word here, non-profit.

+00:31:05 Unlike some software ecosystems, which grew out of and are supported by a single tech giant,

+00:31:11 think .NET and Microsoft or Swift and Apple, Python is by developers, for developers.

+00:31:18 That makes it a truly special place.

+00:31:21 Everyone here is here because they chose to be here.

+00:31:24 It's our garden.

+00:31:25 And we have to tend it. That means supporting the ecosystem, and a big part of that is supporting

+00:31:31 the PSF. That's why I was thrilled when JetBrains and the PyCharm team in particular reached out to

+00:31:36 me to help spread the word about their PSF fundraiser. Here's the deal. You can purchase

+00:31:41 PyCharm Pro for 30% off this week until December 19th, 2025. And when you do, not only do you get

+00:31:48 a deal on PyCharm, but the PyCharm team will donate all the proceeds to the PSF. Let me say

+00:31:54 that again because it's easy to miss. They will donate all the proceeds to the PSF. And a positive

+00:32:00 addition this year is that this also applies to renewals, not just new customers. So if you're

+00:32:05 already a PyCharm fan and customer, renew this week and you get a discount plus you support the PSF.

+00:32:12 To take part in this, you'll need to be sure to use Talk Python's exclusive promo code,

+00:32:17 STRONGERPYTHON, all caps, two words. That's STRONGERPYTHON for the code, and the link is

+00:32:22 talkpython.fm/pycharm-psf-2025. Both of these are in your podcast player's show

+00:32:29 notes. Thank you to PyCharm for supporting the show. I'm trying to decide if I feel like it's

+00:32:36 in the interest, if the motivation of any of those platforms would have been to create something like

+00:32:41 this, right?
Like the Jupyter people are like, I don't see the problem. You just write for Jupyter

+00:32:44 and it works. And the Colab people are like, I don't see a problem. You just write for Colab and

+00:32:49 it works, right? Like what is the platform people's motivation to create integrations and smooth the

+00:32:56 transition to the other ones? That's one tension. On the other though, they might have more stuff

+00:33:00 come to them if they make it more reusable across them, right? So it's interesting that none of them

+00:33:05 actually did that. Yeah, I think, I mean, in hindsight, it makes sense why someone in my

+00:33:09 position, like, I guess, focused on this because it was like, I had two widgets and then times the

+00:33:14 number of platforms. And then if something broke, you'd have to go fix it a bunch of different

+00:33:17 places and it just becomes a maintenance nightmare.

+00:33:22 But one thing that's been really interesting is I think that there is a good pressure to

+00:33:26 support a healthy ecosystem.

+00:33:29 So I'd say before anywidget, it was pretty complicated to create custom widgets.

+00:33:34 And so one of my goals was I want the feeling of being able to author a widget to be very

+00:33:39 similar to the way that you can copy and paste code into a notebook, but move it into a file.

+00:33:44 And then you could publish that to PyPI.

+00:33:46 And that workflow always worked for Python, but there was no way to start prototyping a widget

+00:33:51 inside of a notebook before and then move it out.

+00:33:54 And I really wanted to lower that barrier to entry.

+00:33:58 And then what I think came out of that is you get a much richer ecosystem of people

+00:34:01 that are willing to try and make things.
+

+00:34:05 And then when there's a cool widget that comes out, then that's a good positive pressure

+00:34:08 for other ecosystems, because then people are trying to request,

+00:34:11 they go, hey, I want this widget to work in your environment.

+00:34:13 And that puts more pressure on various environments to implement sort of a more standardized approach.

+00:34:19 Or adopt an adapting layer like anywidget.

+00:34:22 Yep. Yeah, exactly.

+00:34:24 Yeah, to sort of back you up here, Kostal says, thank you for building anywidget.

+00:34:28 Having gone through creating an ipywidget, it was a lot of work.

+00:34:32 So yeah, exactly.

+00:34:35 Let's just take a tour through the widget gallery.

+00:34:39 Like what widgets are available here?

+00:34:42 Got some favorites?

+00:34:43 Yeah, we definitely have some favorites.

+00:34:45 I would say that one of the early adopters of AnyWidget is a fairly popular plotting library called Altair.

+00:34:53 And it allows you to do exactly what you were talking about earlier

+00:34:56 with selecting points and allowing you to get those back as data frames in the kernel.

+00:35:01 So for a while, the way that Altair worked, and I think it still by default works inside of a Jupyter environment,

+00:35:09 is that it creates sort of an interactive output that isn't connected back to the kernel.

+00:35:14 So you can get your output.

+00:35:15 It feels interactive because you can zoom in and you can select and you can do all the things, but it's just a view.

+00:35:21 Exactly.

+00:35:22 That state, I like to think of it as trapped in the output, and you can't get it back in the notebook.

+00:35:27 And so what anywidget does behind the scenes is it allows for that output to communicate back with the kernel,

+00:35:34 which then allows you to update an object or a selection and then run your cell again and view some output.
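The bidirectional kernel/front-end communication described here can be sketched in plain Python. The toy `Model` class below is a made-up stand-in for the real widget comm protocol (no Jupyter machinery involved); it just shows the shape of the idea, a shared observable state that either side can write and both sides can watch:

```python
# A toy sketch of the widget "comm" pattern: both the kernel side and the
# front end hold the same model, so a change made by one is seen by the other.

class Model:
    def __init__(self, **state):
        self._state = dict(state)
        self._observers = []

    def get(self, key):
        return self._state[key]

    def set(self, key, value):
        # Write the new value and notify everyone watching the model.
        old = self._state.get(key)
        self._state[key] = value
        if old != value:
            for callback in self._observers:
                callback(key, value)

    def observe(self, callback):
        self._observers.append(callback)

# Kernel side: watch the model so selections show up as plain Python data.
selection = Model(points=[])
captured = []
selection.observe(lambda key, value: captured.append((key, value)))

# Front end side: a drag gesture selects two points and writes them back.
selection.set("points", [(1.0, 2.0), (3.5, 0.5)])
print(captured)  # [('points', [(1.0, 2.0), (3.5, 0.5)])]
```

In the real thing, the "front end side" is JavaScript running in the browser and the synchronization happens over Jupyter's messaging layer, but the programming model the guest describes is essentially this.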
+

+00:35:40 And just that bi-directional communication between your kernel and your front end

+00:35:45 allows you to do things like create a data frame that is updated when you select.

+00:35:50 Yeah.

+00:35:51 Nice.

+00:35:51 Yeah, it's sort of that example we talked about the outliers before, right?

+00:35:56 Yeah, exactly.

+00:35:58 Altair's super cool.

+00:35:59 You know, I talked to Jake VanderPlas about it when it first came out and it's very beautiful.

+00:36:04 Yeah, Altair was actually, I think, how I got into open source contributions forever ago.

+00:36:09 Like I made like documentation examples and it was very like,

+00:36:13 it was very fun to come full circle to actually have like a dependency

+00:36:17 that somehow like got back into that library many years later.

+00:36:20 Now you built something that supercharges Altair

+00:36:24 because now it has like bidirectional data integration

+00:36:27 sort of thing.

+00:36:27 Yeah, yeah.

+00:36:28 Yeah, that's wild.

+00:36:29 Yeah, maybe just talk to people really quick about that.

+00:36:30 Like you don't have to write a revolutionary feature to be part of open source.

+00:36:35 Yeah, yeah, definitely.

+00:36:36 I think my entry to open source was just I got interested in a plotting library and they had some open tickets for making examples for their documentation.

+00:36:45 And I wanted to learn how to use the plotting library.

+00:36:48 So then I started contributing to them.

+00:36:50 And it was really like those interactions, I think, with maintainers that kind of got a tick in my head where I was like, oh, I think I like this way of working and communicating.

+00:36:58 Like there's a lot of, I don't know, oftentimes the challenges in open source aren't so technical as they are just social and figuring out how to communicate expectations to different users.
+

+00:37:07 And I think specifically interacting with Jake VanderPlas

+00:37:10 on some of those issues, I think I learned a lot about that

+00:37:15 and I was attracted to trying to find new ways to work on problems in open source.

+00:37:20 Yeah, awesome.

+00:37:21 All right, well, that's number one with 10,000 favorites out of the community.

+00:37:27 What are some others?

+00:37:28 Some of the others, I would say, like, so Vincent Warmerdam, who is one of my colleagues

+00:37:32 and you've had him on the podcast many times before.

+00:37:34 A couple times, yeah.

+00:37:35 Yeah, he's created this drawdata widget.

+00:37:38 And I would say when this widget came out,

+00:37:41 it demos very well.

+00:37:44 So the idea is that you have like a canvas that you can draw some points

+00:37:48 and you get those points back out as a data frame.

+00:37:51 And yes, so Vincent has a nice Marimo notebook that's running in the browser.

+00:37:57 Yeah, that's right.

+00:37:57 There we go.

+00:37:59 And I believe if you draw some points.

+00:38:01 I don't like the brush.

+00:38:02 I'm going to change the brush.

+00:38:03 We've got precise data.

+00:38:04 So you could have one kind of data set here and then you could go like, okay, we're going to do,

+00:38:10 you know, this is super interesting because maybe you say like the data kind of looks like this and

+00:38:14 I want to run an algorithm against it, but there's a ton of work to get the data in the right format.

+00:38:18 You could just start, you know, kind of just visually doing these things. And then you can

+00:38:24 go on and analyze it and do all sorts of stuff, just literally by drawing.

+00:38:30 Yeah. We found this is like, educators are quite excited about this kind of widget,

+00:38:34 because it's like, hey, here's this, how does a random forest algorithm work?
+

+00:38:38 It's like, okay, we can just draw a data set and then we can actually view

+00:38:41 how a classification would work over a specific type of data set.

+00:38:44 And I think it's really that type of interplay between something that you can play with

+00:38:49 to create something with the data and then maybe you take scikit-learn or something

+00:38:53 and apply an algorithm and you can help build intuition

+00:38:56 for how these algorithms work just by plugging in with the actual workhorse

+00:39:00 of doing the operations and like computing, like your classifier.

+00:39:04 But instead you have this new input, which is like allowing for a bit more play

+00:39:09 with like learning how those algorithms work.

+00:39:11 Yeah, you have an algorithm and you say, I want to see how, if the data looked like this,

+00:39:17 what would happen?

+00:39:18 If the data looked like that, what would happen?

+00:39:20 And a lot of times people will resort to generating that data with code,

+00:39:24 either by writing it themselves, using a library like Faker or something like that,

+00:39:29 you know, like, how am I going to get it?

+00:39:30 What if it looks like this?

+00:39:31 And here you just say whatever it looks like, and you draw it.

+00:39:34 And then you run your algorithm based on the output of it because it's a widget, an AnyWidget widget.

+00:39:39 And then you just keep going.

+00:39:40 It's super neat, actually.

+00:39:42 Yeah.

+00:39:42 Yeah.

+00:39:43 So to go back to, I think, something you mentioned before as well,

+00:39:48 AnyWidget sort of serves two communities.

+00:39:50 So I think for one, there are folks that won't ever need to touch or learn any JavaScript,

+00:39:55 but can just use these libraries like they would Python packages.

+00:39:58 So the idea is, hey, I can pip install drawdata, and now I get to use a scatter widget.
+

+00:40:02 And I don't care at all how it works, but now I get to understand how this algorithm works.

+00:40:07 But for those that are interested, they can go and learn maybe a little bit of JavaScript

+00:40:12 or progressively a little bit more JavaScript to create something and package it up

+00:40:16 for many different platforms.

+00:40:18 So you sort of have people that want to make libraries and then you have library consumers.

+00:40:22 And depending on how much front-end stuff you want to learn,

+00:40:25 that's how deep you can go into either of those sides.

+00:40:27 But I'd say probably many more people are just widget users and happily wire them together and don't worry about it and move on

+00:40:34 with their day. And then there's some folks that maybe go a little bit deeper on the JS side and

+00:40:39 learn how to create these interactive experiences. When you say wire them together, do you mean like

+00:40:43 put one in a notebook in one cell and then another in another cell, but then say the input of it is

+00:40:48 the output of the other, that kind of wire together? Exactly. Yeah. So you can really think of these as

+00:40:53 like building blocks that you can like compose inside of a notebook environment. And I think

+00:40:57 specifically in Marimo, a really cool thing is that when you update that value that lives in the

+00:41:02 kernel, the cell below will rerun. So now it's not just like you do the thing and then you have to

+00:41:07 manually rerun the cell. You actually start to build up, based on our reactivity graph, a little bit

+00:41:12 of an application: I select some cells, and now that reruns this Python code. And if I have

+00:41:18 more cells that run after that, it all is sort of wired up from this new input, which is your widget.

+00:41:23 Yeah, I kind of riffed on this a little bit when I said like the world's craziest go-to.
+

+00:41:27 And I said the second craziest because the most crazy is Excel.

+00:41:31 But that's because the way Marimo solves that is it actually understands the way these variables are

+00:41:38 used across cells.

+00:41:38 So when I move a widget in one, it redraws.

+00:41:41 It's like, oh, the thing that depended upon it also now has to change.

+00:41:44 And then like you can, it cascades that execution across the widgets, right?

+00:41:48 Exactly.

+00:41:49 And that would normally just happen if you use our built-in UI elements or if you rerun cells.

+00:41:54 And because it's just this reactive model where we trace the dependency, we just track what the dependencies are statically between your cells.

+00:42:02 We know that when you update this property on a widget, we know the cells that you have to update that depend on that widget.

+00:42:08 And so, yeah, it just sort of all falls out of this simple idea of your notebooks are a graph.

+00:42:13 They're not just like this manual thing.

+00:42:16 I think of a lot of running cells as like manual callbacks.

+00:42:20 Like you do something and you're like, oh, I have to click a callback and run it.

+00:42:24 And by modeling it as a graph, we can run those for you.

+00:42:28 And so we know exactly what needs to update when this dependency of that part of the graph is invalidated.

+00:42:35 I've talked to some data scientists who are like, that is both a feature and a problem that you can just be so freeform in Jupyter Notebooks, right?

+00:42:42 Like I can just iterate and play.

+00:42:44 And I think while you're iterating and playing, that freedom is great.

+00:42:48 But as soon as you want to start making decisions, then it becomes like a real danger point, right?

+00:42:53 Yeah, I think there's kind of a nice balance.
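The "notebooks are a graph" idea can be sketched as a toy dependency walk. The cell table and the `stale_cells` function below are purely illustrative, not Marimo's actual data structures:

```python
# A toy sketch of reactive re-execution: each "cell" declares which variables
# it defines and which it reads. When a variable changes, every cell
# downstream of it is stale and gets rerun in dependency order.

cells = {
    "a": {"defines": {"x"}, "reads": set()},
    "b": {"defines": {"y"}, "reads": {"x"}},
    "c": {"defines": {"z"}, "reads": {"y"}},
    "d": {"defines": set(), "reads": {"x"}},
}

def stale_cells(changed_var):
    # Walk the graph outward from the changed variable: a cell reading it is
    # stale, and so is anything reading the variables that cell defines.
    stale, frontier = [], {changed_var}
    remaining = dict(cells)
    while frontier:
        next_frontier = set()
        for name, cell in list(remaining.items()):
            if cell["reads"] & frontier:
                stale.append(name)
                next_frontier |= cell["defines"]
                del remaining[name]
        frontier = next_frontier
    return stale

print(stale_cells("x"))  # ['b', 'd', 'c'] -- b and d read x directly, c via y
```

Moving a widget that defines `x` would invalidate cells b and d immediately, and then c because it reads the `y` that b produces, which is the cascade described above.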
+
+00:42:56 So specifically in Marimo, there are some requirements for how you have to write your Python code that are slightly more limiting than within a Jupyter environment.
+
+00:43:05 And one that often trips folks up is that you can't redeclare variables across cells, or else it would be ambiguous.
+
+00:43:11 Like, is this the cell that defines X or is this the cell that defines X?
+
+00:43:15 But I like to think of it as-
+
+00:43:16 Which order did you run it in?
+
+00:43:18 That's the one.
+
+00:43:19 Yeah, exactly.
+
+00:43:20 That's tricky.
+
+00:43:21 So we're fairly strict on that.
+
+00:43:23 But what we believe is if you buy into this constraint, then we can give you all these
+
+00:43:27 properties, which are deterministic execution order and the ability to turn these scripts,
+
+00:43:34 or turn these sort of open-ended workflows, into more reproducible artifacts
+
+00:43:39 that have a deterministic execution order.
+
+00:43:43 And so I like to think of it a little bit as like, I used to type Python code without type hints.
+
+00:43:47 Then I started using type hints, and now I can't imagine not having any autocomplete.
+
+00:43:51 And now I think, like, when you start working with data in long-lived sessions,
+
+00:43:56 having some of these guardrails actually helps you keep on track,
+
+00:43:59 such that if you accidentally delete a variable, we'll let you know that you deleted it,
+
+00:44:03 and it's not something that the next time you boot up the notebook,
+
+00:44:05 you're just missing that variable.
+
+00:44:06 So it's something that you buy into, but then I think has all these nice consequences
+
+00:44:10 that come out of that.
+
+00:44:12 - As a newbie, you'll see like, these things are not defined,
+
+00:44:15 or this library doesn't exist.
+
+00:44:16 And you know it does exist, you just skipped running the top import cell,
+
+00:44:20 or you know, that stuff's a little frustrating.
+
+00:44:23 So during your talk previously,
+
+00:44:26 you talked about like how anywidget allows you to write
+
+00:44:30 just enough JavaScript.
+
+00:44:32 What do you mean by that?
+
+00:44:33 - What do I mean by just enough JavaScript?
+
+00:44:35 - Yeah, like, so one person's just enough JavaScript,
+
+00:44:38 another person's like, whoa, way too much JavaScript.
+
+00:44:41 Sure.
+
+00:44:41 So when I first started learning any front-end code, my experience was I opened up an HTML file
+
+00:44:47 and I just wrote some JavaScript on the page.
+
+00:44:49 And I completely love this workflow for playing and learning how to write code
+
+00:44:54 for the first time in the browser.
+
+00:44:57 But I think over the years, as people have started to build things
+
+00:45:01 like Figma and Adobe in the browser, there's a ton of tooling that has come up
+
+00:45:07 to help build those types of applications.
+
+00:45:09 So many people's first experience with writing JavaScript
+
+00:45:13 is someone telling them that they need to learn React or learn a specific framework.
+
+00:45:18 And then it's a bit like, well, this isn't technically JavaScript.
+
+00:45:21 This is a flavor of JavaScript that we transform.
+
+00:45:24 And so the experience that I wanted for anywidget was to write this standardized JavaScript code
+
+00:45:31 as sort of the entry point to your widgets.
+
+00:45:34 So it should feel as simple as you could just open up the browser console and start typing in code.
+
+00:45:39 And the code that you write is exactly what the browser understands.
+
+00:45:43 And so it should be dead simple from the beginning.
+
+00:45:46 Just learning some APIs and probably pattern matching between, okay, this is some syntax.
+
+00:45:51 But it should feel pretty familiar.
+
+00:45:53 Should you want to learn those frameworks or use those frameworks, those all have the
+
+00:45:57 ability to be transformed into this standardized JavaScript.
+
+00:46:01 And in fact, they have to be in order to run in the browser.
+
+00:46:03 That's how that all works.
+
+00:46:05 So I wanted that initial experience to be, if you want to learn and you've never tried it out before, send a variable and you can console log that variable and it just works.
+
+00:46:15 But should you want to build something very ambitious, and a lot of our most popular complicated widgets do use some of these frameworks, that's something that you can learn.
+
+00:46:23 So I could do a Vue or an AlpineJS or I could do a React and all of that kind of stuff if I want.
+
+00:46:30 Yeah, so in our documentation, we have like a little CLI to bootstrap a project that's like ready to be used with React.
+
+00:46:41 And I think we had a contributor contribute one for Vue and Svelte as well.
+
+00:46:47 And the whole idea there is like, yes, the tradeoff is that now you are introducing some JavaScript build tooling into this process.
+
+00:46:56 But hopefully if you're familiar with those frameworks, that isn't like a big overhead to trying out some of those things.
+
+00:47:02 Versus like for the Python beginner that wants to learn something in the front end and they're just trying to get their data into the front end.
+
+00:47:10 I just don't want them to have to worry about TypeScript or React or any of these things that they might hear about.
+
+00:47:15 And instead they can just get started with trying to paint some pixels on the screen and then progressively learn outside of that ecosystem.
+
+00:47:22 So the entry point will always be sort of this simple front end standard.
+
+00:47:27 And then how far you want to go into that ecosystem, then you can experience more of
+
+00:47:32 like the tooling there to help with reactivity and things in the front end.
+
+00:47:37 Yeah, that's super interesting to think of injecting little miniature Vue applications
+
+00:47:42 and stuff to allow that to work.
+
+00:47:44 But they are super powerful if you're willing to go through the build steps and all the
+
+00:47:48 hoops that those different frameworks ask you to do to get it to run.
+
+00:47:52 And once you get it set up, it's like, okay, these things all bi-directionally data bind to each other internally.
+
+00:47:57 And so I can see how that plays just like perfectly naturally with the existing binding of the dynamic interaction here.
+
+00:48:04 Yeah, exactly.
+
+00:48:05 So we model this as like we have the standard, which is like this ECMAScript standard, which is the JavaScript code that you write.
+
+00:48:14 And then all of the libraries are modeled as like adapters on top of that standard.
+
+00:48:18 I think we call them bridges.
+
+00:48:19 So in React, you get hooks that you can call, and then you're writing code that looks like idiomatic React,
+
+00:48:25 but behind the scenes, it's calling anywidget APIs.
+
+00:48:29 And our Vue bridge does the same thing, but with Vue APIs.
+
+00:48:32 So you get to write front-end code that feels like it's Vue-like or React-like,
+
+00:48:36 but behind the scenes, you have these custom bridges that are written by folks that are familiar with those frameworks
+
+00:48:41 for how they should interface with our standard specification.
+
+00:48:44 Yeah, wild.
+
+00:48:45 Let's talk about not using one of those, but instead talking through building a simple widget.
+
+00:48:50 Now, just to be clear, I know we can't read all of the code in the audio version
+
+00:48:55 because most people just listen to the show, but maybe just give us a sense going through this.
+
+00:48:59 You have a "build a counter widget" example.
+
+00:49:02 Yeah.
+
+00:49:02 Or I can click a button and it counts or something.
+
+00:49:04 Yeah, just walk us through, just to give people a sense of what does it mean
+
+00:49:07 to build one of these AnyWidget widgets.
+
+00:49:09 Yeah, so the idea within AnyWidget is that you create a class that extends from a single class
+
+00:49:15 that's in anywidget called AnyWidget.
+
+00:49:17 And that is where you define both your view, so the front end part of that code,
+
+00:49:22 and then also the back end code for that.
+
+00:49:25 So it fully encapsulates the idea of a widget that has both a back end and a front end component.
+
+00:49:30 Yeah, and just to be clear for people, let's say when you say a class,
+
+00:49:33 it's not one of those weird prototype JavaScript type things.
+
+00:49:36 It's the more modern class keyword, but in ECMAScript, is that right?
+
+00:49:41 Oh, sorry.
+
+00:49:42 So right, what we're looking at here is just a Python class that's extending.
+
+00:49:46 Oh, this is a Python class.
+
+00:49:47 Okay.
+
+00:49:47 Yeah.
+
+00:49:48 A Python class.
+
+00:49:48 Great.
+
+00:49:49 And so you create a Python class, but then it has the JavaScript.
+
+00:49:52 Exactly.
+
+00:49:53 So ESM is like a private field that defines a front end module.
+
+00:49:58 And the whole idea there is that you define a function that's called render that takes
+
+00:50:02 in two arguments, a model and an element.
+
+00:50:05 So the model is like this object that talks back to the kernel.
+
+00:50:09 And an element is whatever the front end, like Jupyter or VS Code gives you as like the output on the screen.
+
+00:50:15 And so those two things combined, you can call methods on that model to set a value or get a value.
+
+00:50:21 And then you can update some part of the UI.
+
+00:50:24 I see.
+
+00:50:25 So you're passed like a DOM element, which might be a div or whatever.
+
+00:50:29 And then you can just inject JavaScript, like appendChild and other stuff, and set a class and whatever.
+
+00:50:34 And that builds out the little, the part of the DOM that you control or it's given to you?
+
+00:50:38 Exactly. And then you can style that however you want.
+
+00:50:42 And then the key thing is that as a part of this adapter,
+
+00:50:46 when you call methods on this model object that you get, I think there's only a few methods that are on it,
+
+00:50:51 like get, set, and save changes.
+
+00:50:53 And that just synchronizes values back and forth between the front end and the back end.
+
+00:50:58 And that same sort of API is on the Python side as well.
+
+00:51:02 And so you can react to those values as they update or set them as well on the Python side.
+
+00:51:07 And then we'll deal, for the most part, with serialization of simple data types one-to-one.
+
+00:51:12 So if you have a float or an int in Python, we'll just map that to a number data type.
+
+00:51:16 We have that mapping in our documentation.
+
+00:51:18 If you want to serialize to something like JSON, that's totally fine too.
+
+00:51:22 And then you just send that over the wire.
+
+00:51:24 Yeah, that's wild.
+
+00:51:25 So how does it actually, so we've got a Python class,
+
+00:51:28 which I guess probably lives in the kernel.
+
+00:51:31 And then it's got HTML JavaScript stuff it's doing that it's setting values.
+
+00:51:37 What's this exchange?
+
+00:51:39 Where do different pieces run?
+
+00:51:41 How do they communicate?
+
+00:51:43 Sure.
+
+00:51:43 So that is up to the implementation.
+
+00:51:45 So whoever builds an adapter for anywidget.
+
+00:51:48 But the idea is that the front-end code that you write is in the standardized form.
+
+00:51:52 So you can just call import and import this JavaScript module that you have.
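The get/set/save-changes contract described here can be sketched with a small stand-in. This is a toy illustration of the protocol as discussed, not anywidget's real model class; the names below are made up for the example.

```python
# Toy stand-in for the widget "model" described above (illustrative only,
# not anywidget's real implementation): both sides call get/set, and
# save_changes commits pending values and notifies any listeners --
# e.g. a front-end render function that redraws the DOM.

class ToyModel:
    def __init__(self, **state):
        self._state = dict(state)
        self._pending = {}
        self._listeners = {}

    def get(self, name):
        return self._state[name]

    def set(self, name, value):
        self._pending[name] = value       # staged, not yet synchronized

    def save_changes(self):
        for name, value in self._pending.items():
            self._state[name] = value
            for callback in self._listeners.get(name, []):
                callback(value)           # "redraw" whoever is listening
        self._pending.clear()

    def on_change(self, name, callback):
        self._listeners.setdefault(name, []).append(callback)

# Simulate a counter widget's click handler:
model = ToyModel(value=0)
rendered = []
model.on_change("value", rendered.append)  # front end re-renders on change
model.set("value", model.get("value") + 1)
model.save_changes()
print(model.get("value"), rendered)        # 1 [1]
```

In the real system the commit also travels over the host platform's transport (a Jupyter comm or a web socket) so both the kernel and the browser see the same state, but the shape of the API is this get/set/commit/notify cycle.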
+
+00:51:57 And then the front-end is responsible for when these different methods get called on model
+
+00:52:03 for sending that data back and communicating over some sort of mechanism to the backend.
+
+00:52:09 Jupyter already has defined something like that.
+
+00:52:11 Yeah, so Jupyter has a notion of something called a comm,
+
+00:52:14 and that is implemented over a web socket to communicate back and forth.
+
+00:52:18 And similar in Marimo, we're not based on Jupyter at all,
+
+00:52:21 but our anywidget adapter just talks over a web socket with some different messages.
+
+00:52:26 But basically, we call this a host platform that takes care of that implementation
+
+00:52:31 and wires up those two ends.
+
+00:52:32 But then as the widget author, you just care about like, here's my JavaScript module, here's my Python code.
+
+00:52:37 And they use these APIs.
+
+00:52:39 And then it's someone else's problem to figure out how to wire those up together.
+
+00:52:42 Yeah, super neat.
+
+00:52:43 Kasaasa, interesting thought here.
+
+00:52:45 I'd be super interested to see this working purely in Python.
+
+00:52:48 Front end running in Wasm.
+
+00:52:49 No need for JavaScript streams in Python.
+
+00:52:51 What do you think here?
+
+00:52:52 Yeah, yeah.
+
+00:52:53 I haven't played around with some of the Pyodide or the PyScript APIs.
+
+00:53:00 I think that that would probably be possible.
+
+00:53:02 The example that you had of drawdata before for the Marimo notebook that Vincent had,
+
+00:53:09 that is entirely running in the browser. So that is like the Marimo Python kernel is running in Wasm,
+
+00:53:15 and then we're also loading the JavaScript code. So we do have folks that just create a Marimo notebook
+
+00:53:21 and then compile it to WebAssembly, and then they have a static website that has both of these bits together.
+
+00:53:27 Okay, that's actually pretty wild, isn't it?
+
+00:53:31 So I've seen that there's some other places that I can, anywidget, like MyST.
+
+00:53:36 Yeah.
+
+00:53:37 And also Mosaic.
+
+00:53:38 Tell us about these.
+
+00:53:39 Yeah, yeah, definitely.
+
+00:53:40 So MyST, I think on our community page, we have a notion of widgets, and then we have a notion of host platforms.
+
+00:53:50 And so the two primary host platforms right now are things that are Jupyter-based, so like VS Code, Notebooks, Google Colab,
+
+00:54:01 Binder, like all these things that are based off of a Jupyter kernel and run Python behind the scenes.
+
+00:54:07 Then Marimo is another host platform, and MyST is currently working on an integration
+
+00:54:12 around that specification for the front-end code. And so the idea is like, as long as they have a way
+
+00:54:17 to, and I'm not sure how far along that is at the moment, but as long as there is a way to run Python
+
+00:54:23 code and sort of maintain that relationship between that front-end code and back-end code, then
+
+00:54:27 you could drop anywidgets within your MyST markdown, for example, as well.
+
+00:54:31 Yeah.
+
+00:54:32 MyST is wild.
+
+00:54:33 I had that team on the podcast a while ago.
+
+00:54:35 It's like, you want to create an e-book based on notebooks or whatever, LaTeX paper, or
+
+00:54:42 you name it, right?
+
+00:54:42 Yeah.
+
+00:54:43 And yeah.
+
+00:54:45 I haven't done anything with Mosaic.
+
+00:54:46 Yeah, Mosaic.
+
+00:54:47 I think this is a really cool project.
+
+00:54:50 So at its core, Mosaic is a bit more of like an architecture.
+
+00:54:54 And this is probably one of the things that I'm most excited about,
+
+00:54:56 like the ability that anywidget is bringing to folks to experiment with new types of architectures
+
+00:55:02 for building high-performance visualizations.
+
+00:55:05 So inside a Mosaic, you have a notion of some database source
+
+00:55:10 that I believe is typically DuckDB.
+
+00:55:13 And then you write a bunch of front-end code that basically expresses all of the data that it needs
+
+00:55:17 as queries to that database.
+
+00:55:18 And then there's a lot of optimizations that can happen between those individual views and the database
+
+00:55:24 that can all be coordinated and allow for very scalable data visualizations.
+
+00:55:29 One cool thing about Mosaic is that this architecture lends itself very well to notebooks
+
+00:55:34 because you can keep that database, that DuckDB, running completely in the kernel,
+
+00:55:39 and then you just have your front-end client. But then there's also an option where you just
+
+00:55:42 completely compile everything into WebAssembly, and then that DuckDB is running in the browser as
+
+00:55:48 well. And so you get to reuse a lot of this architecture, and it allows just a lot of code
+
+00:55:53 reuse. And you're just kind of moving that lever of like, okay, do I need high-performance compute?
+
+00:55:58 Okay, let me put my database in my HPC and then I'll just have a thin client on top of it. Versus,
+
+00:56:03 okay, actually I want to look at everything in the browser. Like maybe we can just compile to
+
+00:56:07 WebAssembly and let someone drop in a CSV, and then we can reuse the same visualization. So yeah,
+
+00:56:12 this is super, yeah, that's super flexible. DuckDB has a very interesting WebAssembly story as well, so
+
+00:56:18 yeah, definitely. It's definitely catching a lot of attention and people, you know, it's
+
+00:56:23 It's the data science SQLite, right?
+
+00:56:26 Yeah.
+
+00:56:27 And I'd say like a growing trend that I've seen by being able to,
+
+00:56:31 like one thing I didn't, that didn't fully, I wasn't fully aware of when I started on anywidget,
+
+00:56:35 but it has been fun to see is because it became easier to sort of start playing
+
+00:56:39 with like both front end code and Python code in the same environment.
+
+00:56:42 Like there've been a lot more, I think, like experimentations around,
+
+00:56:46 like trying out new types of architectures for building like visualizations.
+
+00:56:50 And so one trend that I've noticed in our community gallery
+
+00:56:53 is that a lot of our most popular widgets are taking advantage of new data formats like Apache Arrow
+
+00:57:00 to be able to send lots of data to the browser to put on the GPU and render very quickly.
+
+00:57:05 And so basically, there's a really cool example of a widget called Lonboard,
+
+00:57:10 which is a geospatial visualization widget, which is a wrapper around a front-end library called deck.gl
+
+00:57:16 that was made by Uber to do geospatial visualization.
+
+00:57:20 And the previous Jupyter integration, like, serialized data to JSON to be able to render inside of deck.gl.
+
+00:57:27 So the moment that you had to render maybe a few million points,
+
+00:57:29 it's like you're spending a ton of time serializing that data
+
+00:57:31 to be able to put it in the browser.
+
+00:57:33 Right.
+
+00:57:34 It's like using an ORM.
+
+00:57:35 They're great until you try to do a query with 100,000 records.
+
+00:57:38 You're like, why is this so slow?
+
+00:57:39 Exactly.
+
+00:57:40 And so I think there's an example with Lonboard where some data set with like 3.5 million rows just crashed in the previous integration,
+
+00:57:48 or never rendered, maybe it took a few seconds to render,
+
+00:57:51 or a minute to render.
+
+00:57:53 But Lonboard uses Apache Arrow and grabs that data from either Arrow, Parquet,
+
+00:57:59 or all these different file formats.
+
+00:58:00 And that's very fast to do, because you just copy that buffer and ship it to the front end
+
+00:58:04 and put it on the GPU.
+
+00:58:06 And so it can do that same example in, I think, one or two seconds on modern machines.
+
+00:58:11 And so it really is this, get your data into a data frame
+
+00:58:13 and dump it on the GPU and visualize it.
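The speedup described here comes down to shipping a fixed-width binary column instead of generating and re-parsing text. A rough stdlib-only illustration of that contrast (Arrow's real format carries schemas, chunking, and much more; the numbers below are just a toy column):

```python
# Rough stdlib-only illustration of why columnar binary transfer beats
# text JSON for large numeric data (Apache Arrow's real format is far
# richer than this; `array` here just stands in for a fixed-width column).
import array
import json

points = [i * 0.5 for i in range(100_000)]

# JSON: every number is formatted to text, then parsed back on the other end
json_payload = json.dumps(points)

# Fixed-width binary: one contiguous buffer, 8 bytes per float64, no parsing
buffer = array.array("d", points).tobytes()
print(len(buffer))                # 800000 -- exactly 8 * len(points)

restored = array.array("d")
restored.frombytes(buffer)        # reconstructed with a straight memory copy
print(restored[2])                # 1.0
```

The binary side round-trips with a plain memory copy, which is the same reason an Arrow buffer can be handed to the browser and on to the GPU without a million-element parse step.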
+
+00:58:16 And that whole idea of, oh, what format do I need to convert to to be able to use this tool, this visualization tool
+
+00:58:23 that I've heard? It's like, no, no, no, just load it as a data frame. And then as long as you get
+
+00:58:27 your data as a data frame, we can throw it into the web browser. Yeah, super neat. That's the kind
+
+00:58:33 of stuff that normal web people don't think about, right? You're like, well, how much is it to
+
+00:58:37 serialize a little JSON response? Or, you know, we're talking about a million of them. Oh.
+
+00:58:42 Yeah, yeah, exactly. And I guess one thing I've been impressed with is just at times where my
+
+00:58:48 understanding of how good computers have gotten is like, it's just shocking. It's more like, oh,
+
+00:58:54 I think we're holding ourselves back at times by maybe some of the standard practices versus like,
+
+00:59:00 yeah, hardware has gotten very good. And if you can make use of that hardware efficiently,
+
+00:59:04 there's quite a bit that you can do on very low powered devices and visualize a ton of data.
+
+00:59:10 Yeah. Quite a while ago, like really long time ago, I did some OpenGL programming and you see,
+
+00:59:15 you know, like here, we gave this, the scene a hundred thousand triangles and it's rendering
+
+00:59:19 at 200 frames a second. You're like, how many operations is that? Like, that is insane. You know,
+
+00:59:25 this is like 30 years ago. You were just like, that just blows my mind. Yeah. I think people
+
+00:59:29 underestimate what computers can do sometimes, especially if you work with them, right? You're
+
+00:59:32 not doing three levels of serialization and deserialization and so on. Yeah. Yeah,
+
+00:59:37 exactly. That's wild. So what about publishing one of these things? If we go and write,
+
+00:59:42 I've created an AnyWidget and it's amazing for me to use, but I want to put it on PyPI so people can uv pip install it or whatever in their notebook.
+
+00:59:51 Yeah.
+
+00:59:52 So I have a couple.
+
+00:59:53 One, I would recommend if folks are really interested in it, there's a couple of videos I have on my YouTube of just like publishing some widgets and they're linked in the AnyWidget documentation.
+
+01:00:03 But that file that you see there, that's open in the current view that we have.
+
+01:00:09 If you know how to publish a Python file, you can just publish that to PyPI,
+
+01:00:12 and it should just work.
+
+01:00:13 So yeah, the one caveat I'd have is like, you're looking at an inline string here,
+
+01:00:18 and that's not always the nicest way to write your front end code.
+
+01:00:22 So you can also just import, like you can reference that as a file instead.
+
+01:00:26 And so you can put that code into a separate file, type check that or like lint it or like format it
+
+01:00:31 as JavaScript.
+
+01:00:32 - So write a JavaScript file, and then somehow set just,
+
+01:00:36 would I just read that in Python?
+
+01:00:39 And then just set the text result, like use pathlib and say read_text and then jam it in there?
+
+01:00:43 Yeah, we support if you just pass a pathlib Path to ESM, we'll read it for you.
+
+01:00:49 Oh, okay, perfect, yeah.
+
+01:00:51 So as long as I have a package and the package points out that these JavaScript files
+
+01:00:56 are to be included at build, then you're good to go?
+
+01:00:59 Yep, and our starter kit template is all configured with Hatchling to do that for you.
+
+01:01:04 Yeah, very nice.
+
+01:01:05 Okay, and I've been using uv build, uv publish these days,
+
+01:01:10 and it's pretty seamless.
+
+01:01:12 Yeah, yeah.
+
+01:01:12 I think maybe we might need to update the build script to have uv build as the backend.
+
+01:01:17 So, yeah.
+
+01:01:17 Yeah, even if you do uv build against something with a different backend,
+
+01:01:20 I think uv just runs that backend instead.
+
+01:01:22 So not a big deal.
+
+01:01:24 Not a big deal.
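Per the discussion, moving the inline ESM string into a separate .js file is mostly just file reading, and the library will reportedly accept a pathlib.Path directly and read it for you. A small sketch of both options; the filename, the JavaScript body, and the commented-out class name are all illustrative, not from the episode:

```python
# Sketch of keeping front-end code in a separate .js file (filenames and
# the commented-out class below are illustrative examples only).
import tempfile
from pathlib import Path

widget_js = Path(tempfile.mkdtemp()) / "widget.js"
widget_js.write_text(
    'export default { render({ model, el }) { el.textContent = "hi"; } }'
)

# Option 1: read the file yourself and pass the string as the ESM source
esm_source = widget_js.read_text()
print("render" in esm_source)     # True

# Option 2 (as described in the episode): hand the Path straight over and
# let the library read it for you, e.g. something shaped like:
#
#   class MyWidget(anywidget.AnyWidget):
#       _esm = Path(__file__).parent / "widget.js"
```

The win of the file-based approach is exactly what's mentioned above: the JavaScript can now be linted, formatted, and type checked as JavaScript rather than living inside a Python string.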
+
+01:01:25 So what are maybe some of the rough edges that people need to look out for these days?
+
+01:01:30 It's got a lot of stuff that makes things easy, but what should they be maybe on the lookout for?
+
+01:01:35 I think, hopefully, if you want to try out a widget, it should be as simple as installing any other Python package.
+
+01:01:44 And that is really a goal that we've had for the project, is that if you want to use this like you would some algorithm library that just does computation,
+
+01:01:52 we want that to be as simple as pip installing and getting started.
+
+01:01:57 I would say probably the highest barrier to entry is that it is just this idea of just enough JavaScript.
+
+01:02:04 And so one thing that I try to emphasize in the docs is that I'd really start with a very
+
+01:02:10 simple example if you've never played around in the browser and get used to opening up
+
+01:02:15 your developer tools and logging things and just understanding how to develop maybe a
+
+01:02:20 little bit in the front end.
+
+01:02:22 If you're coming from the front end side, then it would be learning a little bit more
+
+01:02:25 of how does Python work and how do I debug things on the Python side.
+
+01:02:29 And what is PyPI and how do I even work with that thing?
+
+01:02:32 Exactly. So it's really, I'd say, most of the rough edges are just these problems of working
+
+01:02:37 with two language ecosystems. But that's also been a very important part of AnyWidget because we don't
+
+01:02:41 want people that are ambitious to build things that sort of bridge this gap to feel restricted
+
+01:02:46 from either using things in the Python ecosystem or the web ecosystem. Because we could always write
+
+01:02:51 wrappers to make things easier for authors, but then we'll probably get a bunch of issues that
+
+01:02:56 then say, hey, I want to use this API and you haven't exposed it.
And they're like, hey, we'd like to use
+
+01:03:00 Vue, and you're like, well, you use our Python thing that defines the DOM.
+
+01:03:03 Like, oh, okay.
+
+01:03:04 Yeah, exactly.
+
+01:03:05 So with great power comes great responsibility, but we want to give that to people that want
+
+01:03:11 to build these types of integrations.
+
+01:03:12 Yeah.
+
+01:03:13 So what about agentic AI for this kind of stuff?
+
+01:03:16 I know five years ago, you have to learn every step of the JavaScript.
+
+01:03:20 How much can I say, take my inline JavaScript, put it into a JS file, and then point it at,
+
+01:03:26 you know, ask Claude Sonnet, hey, this is an AnyWidget widget, and this is the JavaScript,
+
+01:03:30 and here's what I'm trying to accomplish.
+
+01:03:31 I bet you could go pretty far with a marginal understanding of JavaScript
+
+01:03:36 as a Python person, plus some kind of AI.
+
+01:03:39 What do you think?
+
+01:03:39 Oh, yeah, absolutely.
+
+01:03:41 I think, like I mentioned earlier, Vincent and I work very closely together.
+
+01:03:46 And his joke that he likes to tell is that I created anywidget
+
+01:03:49 so that he could make anywidgets.
+
+01:03:53 And the ability to vibe code sort of the front end for an object
+
+01:03:58 really is eye-opening if you've never tried it
+
+01:04:00 before, if you're working with your data.
+
+01:04:02 And I think one of the barriers maybe of trying out or of thinking about how to use anywidget
+
+01:04:09 is knowing when it might be useful.
+
+01:04:11 And really, it's like anytime you have an object that you might want to visualize in a particular way
+
+01:04:16 or maybe extend with some extra capabilities in your notebook environment,
+
+01:04:21 that's a perfect time to open up some agentic tool and start trying to explain what you're thinking about.
+
+01:04:29 Because it's just JavaScript and not like a custom framework that we wrote,
+
+01:04:35 they're very good at writing this type of code. And so sometimes you might need to,
+
+01:04:39 we have a set of rules to try to help it, make sure it doesn't try to do React or
+
+01:04:43 something, but it can go very far. And because it's just like a view on top of your data,
+
+01:04:48 it's kind of the perfect thing to play around with vibe coding, I think.
+
+01:04:52 It's incredible how far you can go with this stuff. You can give it your data source
+
+01:04:56 and say, here's the data source, here's what I'm trying to do.
+
+01:04:58 And because the data sources often can be really foundational, like CSV or whatever.
+
+01:05:02 And then importantly, because it's just pure JavaScript,
+
+01:05:06 AI goes crazy on pure JavaScript and pure Python, right?
+
+01:05:10 It's unleashed.
+
+01:05:11 Yeah, it's crazy.
+
+01:05:12 So to plug it back to Marimo for a second, I think because it's just Python and just JavaScript,
+
+01:05:19 it's a really fun environment to actually create anywidgets in.
+
+01:05:22 And we have some integrations with like AI chat within our editor where we actually will inspect some of your Python variables and include that
+
+01:05:32 in the prompts as you're trying to work on things.
+
+01:05:34 So if you're saying, "Hey, I want to make a widget and I want to visualize this data
+
+01:05:38 frame," you can actually tag the object that you want to start programming around.
+
+01:05:43 And that will include things like the schema and stuff as it gives it to the model, such
+
+01:05:46 that that sort of grounds it, at least when it's writing that JavaScript code,
+
+01:05:50 to grab the right columns and things off the data set and hit the ground running without
+
+01:05:54 you having to procure those things as well.
+
+01:05:56 So it's a pretty fun environment to create these things.
+
+01:05:59 It's a weird time, isn't it, that we live in?
+
+01:06:01 Kind of magical and also scary, but just weird.
+
+01:06:04 Yeah, definitely.
+
+01:06:05 But very much fun as well.
+
+01:06:08 I would throw out there, if people are interested, check out the Matt Makai episode I did on HHC programming
+
+01:06:14 three or four episodes ago.
+
+01:06:15 And choose a good model.
+
+01:06:16 I find people are like, I tried this.
+
+01:06:18 It did a bunch of junk.
+
+01:06:19 It's like, yeah, but you used the free tier, right?
+
+01:06:20 I see.
+
+01:06:22 There's a big difference between the top tier models and the cheaper ones.
+
+01:06:26 So, all right. Where are we going? What's the roadmap look like for anywidget?
+
+01:06:31 Yeah. So the roadmap for anywidget, I think, looks a lot like trying to iron out, like trying to get authors to understand some of these patterns that are emerging in the ecosystem for building some of these maybe more high-performance visualization tools.
+
+01:06:46 In terms of the library itself, there may be a few features that we want to add for users.
+
+01:06:51 But I think one of the most important things is that we just ensure that we stay backwards compatible now that we have something that people are building around.
+
+01:07:01 And so my call to action is just to try to get more folks, if they are curious, to try out building new widgets and let's keep this ecosystem pretty healthy.
+
+01:07:12 And then on top of that, if there are places where folks are running into limitations of the specification as is, then getting invested parties, either from, you know, the implementers, to try to understand what APIs we need to add to support things in the future.
+
+01:07:26 So, yeah.
+
+01:07:27 Amazing.
+
+01:07:28 It was on my mind, but I forgot to ask it when we were talking about this: where things run.
+
+01:07:33 If this is running in the kernel and you're doing HPC type stuff, I could potentially use Rust or C++ as part of building my widget.
+
+01:07:40 Is that right?
+
+01:07:41 Yeah, definitely. So one pattern we've seen is like, you know, you just try to do as little as you like, just the interactive bits that you want in the front end. And then you have the full resources of anything that you can do in Python on the back end side, right. And because Python is such a language that has sort of grown to be able to be extended with all these systems level programming languages, there's a lot that you can tap into there in terms of, yeah, high performance compute.
+
+01:08:07 So my hope is that really we push anywidget authors to create interesting interactive elements that then allow us to really have that extra input into this really scalable data ecosystem that we have on the Python side.
+
+01:08:23 Awesome.
+
+01:08:24 PRs accepted?
+
+01:08:25 Absolutely.
+
+01:08:26 Yeah.
+
+01:08:26 Yeah.
+
+01:08:27 Yeah.
+
+01:08:27 Okay.
+
+01:08:28 So yeah, people can check it out here on GitHub.
+
+01:08:30 Of course, we'll link to that.
+
+01:08:31 And yeah, you got ideas.
+
+01:08:33 All right.
+
+01:08:33 Get in there and add it.
+
+01:08:35 Yeah.
+
+01:08:35 And we also have a Discord.
+
+01:08:36 So folks, like you open up a notebook and you run into something or you just, you have an idea,
+
+01:08:42 like there's plenty of folks, or I'm in there.
+
+01:08:45 And I love to try to get people through their first bit of JavaScript ever.
+
+01:08:50 I think it's pretty fun when someone has an idea and we help them get there.
+
+01:08:54 They're like, but it's JavaScript.
+
+01:08:56 Like, no, no, you're going to be okay.
+
+01:08:57 It's not that much.
+
+01:08:57 It's just enough JavaScript, right?
+
+01:08:59 Exactly.
+
+01:09:00 Whatever that means to you.
+
+01:09:02 Exactly.
+
+01:09:03 Just enough is so that my thing works.
+
+01:09:05 All right.
+
+01:09:05 Trevor, thank you so much for being on the show.
+
+01:09:07 I really appreciate it.
+
+01:09:08 And congrats on anywidget.
+
+01:09:10 Looks super cool.
+
+01:09:12 It definitely looks like it's all you need and easy to work with.
+
+01:09:14 Yeah, thanks so much for having me.
+
+01:09:16 Yeah, you bet.
+
+01:09:17 See you later.
+
+01:09:17 See ya.
+
+01:09:19 This has been another episode of Talk Python To Me.
+
+01:09:22 Thank you to our sponsors.
+
+01:09:23 Be sure to check out what they're offering.
+
+01:09:24 It really helps support the show.
+
+01:09:26 Look into the future and see bugs before they make it to production.
+
+01:09:30 Sentry's Seer AI code review uses historical error and performance information
+
+01:09:34 at Sentry to find and flag bugs in your PRs before you even start to review them. Stop bugs before
+
+01:09:41 they enter your code base. Get started at talkpython.fm/seer-code-review. And this
+
+01:09:47 episode is sponsored by JetBrains and the PyCharm team. This week only through December 19th, 2025,
+
+01:09:54 get 30% off of PyCharm Pro, including renewals, and PyCharm will donate all the proceeds to the
+
+01:10:00 Python Software Foundation. Support the PSF by getting or renewing PyCharm. Visit talkpython.fm
+
+01:10:06 /pycharm-psf-2025 and use the code strongerpython. Both of these are in your podcast
+
+01:10:13 player show notes. If you or your team needs to learn Python, we have over 270 hours of beginner
+
+01:10:18 and advanced courses on topics ranging from complete beginners to async code, Flask, Django,
+
+01:10:24 HTML, and more.
+
+01:10:31 And if you're not already subscribed to the show on your favorite podcast player,
+
+01:10:35 what are you waiting for?
+
+01:11:02 I'm out. 
+

diff --git a/transcripts/530-anywidget.vtt b/transcripts/530-anywidget.vtt
new file mode 100644
index 0000000..dd54695
--- /dev/null
+++ b/transcripts/530-anywidget.vtt
@@ -0,0 +1,3941 @@
+WEBVTT
+
+00:00:00.020 --> 00:00:06.680
+For years, building interactive widgets in Python notebooks meant wrestling with toolchains, platform quirks, and a mountain of JavaScript machinery.
+
+00:00:07.400 --> 00:00:10.260
+Most developers took one look and backed away slowly.
+
+00:00:10.860 --> 00:00:13.540
+Trevor Manz decided that barrier did not need to exist.
+
+00:00:13.920 --> 00:00:15.000
+His idea was simple.
+
+00:00:15.540 --> 00:00:21.920
+Give Python users just enough JavaScript to unlock the web's interactivity without dragging along the rest of the web ecosystem.
+
+00:00:22.600 --> 00:00:28.840
+That idea became anywidget, and it's quickly becoming the quiet, connective tissue of modern interactive computing.
+
+00:00:29.660 --> 00:00:34.840
+Today, we dig into how it works, why it's taken off, and how it might change the way we explore data.
+
+00:00:35.520 --> 00:00:40.500
+This is Talk Python To Me, episode 530, recorded November 25th, 2025.
+
+00:00:57.620 --> 00:00:59.020
+Welcome to Talk Python To Me,
+
+00:00:59.140 --> 00:01:02.380
+the number one Python podcast for developers and data scientists.
+
+00:01:03.080 --> 00:01:04.300
+This is your host, Michael Kennedy.
+
+00:01:04.680 --> 00:01:08.220
+I'm a PSF fellow who's been coding for over 25 years.
+
+00:01:08.860 --> 00:01:09.920
+Let's connect on social media.
+
+00:01:10.300 --> 00:01:13.380
+You'll find me and Talk Python on Mastodon, Bluesky, and X.
+
+00:01:13.740 --> 00:01:15.560
+The social links are all in your show notes.
+
+00:01:16.340 --> 00:01:19.840
+You can find over 10 years of past episodes at talkpython.fm.
+
+00:01:20.020 --> 00:01:23.220
+And if you want to be part of the show, you can join our recording live streams.
+
+00:01:23.500 --> 00:01:24.060
+That's right. 
+
+00:01:24.340 --> 00:01:27.500
+We live stream the raw uncut version of each episode on YouTube.
+
+00:01:27.860 --> 00:01:32.520
+Just visit talkpython.fm/youtube to see the schedule of upcoming events.
+
+00:01:32.730 --> 00:01:36.400
+Be sure to subscribe there and press the bell so you'll get notified anytime we're recording.
+
+00:01:37.240 --> 00:01:40.040
+Look into the future and see bugs before they make it to production.
+
+00:01:40.780 --> 00:01:46.140
+Sentry's Seer AI Code Review uses historical error and performance information at Sentry
+
+00:01:46.510 --> 00:01:50.460
+to find and flag bugs in your PRs before you even start to review them.
+
+00:01:51.140 --> 00:01:52.960
+Stop bugs before they enter your code base.
+
+00:01:53.460 --> 00:01:57.340
+Get started at talkpython.fm/seer-code-review.
+
+00:01:57.920 --> 00:02:00.860
+And this episode is sponsored by JetBrains and the PyCharm team.
+
+00:02:01.420 --> 00:02:07.920
+This week only through December 19th, 2025, get 30% off of PyCharm Pro, including renewals,
+
+00:02:08.380 --> 00:02:12.280
+and PyCharm will donate all the proceeds to the Python Software Foundation.
+
+00:02:12.840 --> 00:02:15.540
+Support the PSF by getting or renewing PyCharm.
+
+00:02:15.980 --> 00:02:22.040
+Visit talkpython.fm/pycharm-psf-2025 and use the code STRONGERPYTHON.
+
+00:02:22.700 --> 00:02:24.540
+Both of these are in your podcast player show notes.
+
+00:02:25.540 --> 00:02:28.640
+Trevor, welcome to Talk Python To Me. Awesome to have you here. Thanks for coming.
+
+00:02:29.040 --> 00:02:30.000
+Yeah, thanks for having me, Michael.
+
+00:02:30.440 --> 00:02:36.160
+We're going to talk about building Lego blocks for data scientists. Is that maybe a good way to put it?
+
+00:02:36.500 --> 00:02:37.040
+Yeah, definitely. 
+
+00:02:37.620 --> 00:02:47.800
+Yeah, so we're going to talk about your project, AnyWidget, which is a super cool project that allows people to build more adaptable, more reusable widgets,
+
+00:02:48.100 --> 00:02:52.160
+rather than just saying I'm going to build it for Jupyter or Marimo or whatever.
+
+00:02:52.360 --> 00:02:57.060
+but I'm going to build something that can be used broadly through your platform and a lot simpler
+
+00:02:57.800 --> 00:03:02.180
+deployment as well. So I know a lot of people out there are excited. Not everyone necessarily wants
+
+00:03:02.180 --> 00:03:07.220
+to build widgets, but this also makes more widgets available for people, right? So super cool project.
+
+00:03:07.640 --> 00:03:11.820
+Yeah. And before we get into that, of course, maybe just tell us a bit about yourself.
+
+00:03:12.220 --> 00:03:18.439
+Yeah. So I'm Trevor. I'm a software engineer at Marimo, but prior to being at Marimo, I was
+
+00:03:18.680 --> 00:03:25.300
+doing my PhD in data visualization at Harvard Medical School. So I got into this whole like
+
+00:03:25.460 --> 00:03:31.980
+Python web world from working on tools for biological scientists to explore like large
+
+00:03:32.400 --> 00:03:37.600
+genomics or spatial genomic data sets. That is a super interesting topic. I'm sure
+
+00:03:38.260 --> 00:03:43.600
+there were some really niche things you had to build, right? Like, you know, we have this crazy
+
+00:03:43.940 --> 00:03:48.220
+research machine. There's three of them in the world and here's the data that comes out of it.
+
+00:03:48.260 --> 00:03:51.200
+how do I put that into notebooks or whatever, right?
+
+00:03:51.860 --> 00:03:53.040
+Yeah, yeah, exactly.
+
+00:03:53.300 --> 00:03:54.420
+And it's funny you mentioned that. 
+ +00:03:55.020 --> 00:03:57.560 +I think starting off in more like the visualization side, + +00:03:57.720 --> 00:04:00.520 +we'd build these very bespoke and custom applications + +00:04:00.820 --> 00:04:03.080 +for looking at large biological data sets. + +00:04:03.660 --> 00:04:05.760 +And then I think a motivation for me was realizing + +00:04:05.880 --> 00:04:07.200 +a lot of the folks that I was working with + +00:04:07.340 --> 00:04:10.160 +to build these tools for worked in Python + +00:04:10.360 --> 00:04:11.680 +and worked in notebooks specifically. + +00:04:11.880 --> 00:04:14.360 +And so there was sort of this gap between the research + +00:04:14.460 --> 00:04:16.100 +that we were doing to build these specialized tools + +00:04:16.280 --> 00:04:18.220 +and trying to meet users where they were at + +00:04:18.239 --> 00:04:20.320 +to actually use those tools in their research + +00:04:20.640 --> 00:04:22.560 +and as a part of their data exploration workflows. + +00:04:23.060 --> 00:04:24.300 +- And this was your PhD work? + +00:04:24.680 --> 00:04:26.180 +- Yes, yeah, this was my PhD work. + +00:04:26.440 --> 00:04:26.920 +- Incredible. + +00:04:27.380 --> 00:04:28.980 +So I'm sure it wasn't just Python. + +00:04:30.240 --> 00:04:31.120 +- No, so-- + +00:04:31.120 --> 00:04:32.400 +- What was involved with it? + +00:04:32.760 --> 00:04:36.240 +- So yeah, I'd say the research community, + +00:04:36.350 --> 00:04:38.860 +at least in bio, is sort of split between R and Python, + +00:04:39.260 --> 00:04:41.540 +but specifically some of our closest collaborators + +00:04:41.760 --> 00:04:42.800 +were in the Python ecosystem. + +00:04:43.620 --> 00:04:46.920 +And then my team was building a lot of user interfaces + +00:04:46.940 --> 00:04:49.560 +for visual exploration of data sets. 
+
+00:04:49.740 --> 00:04:52.660
+And I think this notebook sort of became this space
+
+00:04:52.760 --> 00:04:54.020
+and specifically Python notebooks
+
+00:04:54.240 --> 00:04:56.820
+where you can really blend those programmatic
+
+00:04:57.060 --> 00:04:58.160
+and user interfaces together.
+
+00:04:59.020 --> 00:05:01.500
+That's sort of why I centered on trying to solve
+
+00:05:01.700 --> 00:05:03.280
+some of the rough edges
+
+00:05:03.560 --> 00:05:06.560
+about bringing these interactive tools inside of notebooks.
+
+00:05:07.160 --> 00:05:07.820
+- That's cool.
+
+00:05:08.180 --> 00:05:11.140
+Did you do any other platforms?
+
+00:05:11.520 --> 00:05:15.400
+I'm thinking like Unity or Unreal,
+
+00:05:16.180 --> 00:05:19.760
+Unreal Engine for like flying through data or what else did you all do?
+
+00:05:20.380 --> 00:05:25.240
+Yeah, there were some folks on my team that ended up or are currently working in more like spatial,
+
+00:05:25.680 --> 00:05:27.480
+like AR, VR type scenarios.
+
+00:05:27.600 --> 00:05:29.580
+And that I think is based on the Unity engine.
+
+00:05:30.120 --> 00:05:34.960
+Most of my work was like web-based and we, and specifically, and we'll probably get into it.
+
+00:05:34.960 --> 00:05:38.940
+I think the web is just a really important platform to develop for in the sense that,
+
+00:05:39.820 --> 00:05:41.620
+I mean, we're recording right now in a web browser.
+
+00:05:42.120 --> 00:05:47.040
+It's a very capable platform that everyone sort of has an entry point to on some sort of device.
+
+00:05:47.440 --> 00:05:55.360
+And so developing for the web often means that you can reuse a lot of that application amongst different devices and in different contexts.
+
+00:05:56.220 --> 00:05:58.240
+It's unbelievable what the web can do these days. 
+ +00:05:58.300 --> 00:06:08.460 +I made the comment before we started, like, I can't believe this is a web browser that we're building, that we're doing this live streaming, video sharing, local recording. + +00:06:08.520 --> 00:06:11.020 +All this stuff is just like, it's so wild. + +00:06:11.340 --> 00:06:14.760 +And I built an app recently that I'm using to kind of help out with some of the interview + +00:06:14.940 --> 00:06:15.020 +stuff. + +00:06:15.180 --> 00:06:15.880 +And it's on the web. + +00:06:15.950 --> 00:06:16.900 +It just runs on my iPad. + +00:06:17.220 --> 00:06:22.560 +I mean, it goes underappreciated some of the time, I think, of just how far you can push + +00:06:22.660 --> 00:06:22.740 +it. + +00:06:22.840 --> 00:06:25.020 +It's not just documents and Ajax. + +00:06:25.380 --> 00:06:25.600 +Yeah. + +00:06:25.830 --> 00:06:25.960 +Yeah. + +00:06:26.160 --> 00:06:29.400 +The fact that a version of Photoshop runs in the browser is pretty amazing. + +00:06:30.940 --> 00:06:33.340 +And Figma itself is all running in the web browser. + +00:06:33.580 --> 00:06:35.320 +So it's definitely a capable platform. + +00:06:35.620 --> 00:06:40.100 +But one thing it's not very good at is, or doesn't have the ecosystem for, is a lot of + +00:06:40.240 --> 00:06:40.580 +data work. + +00:06:40.940 --> 00:06:42.400 +And I think that that's really the big divide. + +00:06:42.840 --> 00:06:44.680 +So I see the web is really like + +00:06:44.960 --> 00:06:46.580 +one of the most accessible platforms + +00:06:46.880 --> 00:06:49.880 +for building like user interfaces and like applications. + +00:06:50.520 --> 00:06:52.020 +But then there's this whole other world, + +00:06:52.200 --> 00:06:54.840 +which is like tools that you actually do data science stuff in. 
+
+00:06:54.900 --> 00:06:56.140
+And that is not the web whatsoever,
+
+00:06:56.600 --> 00:06:58.480
+but has like a similar amount of maturity
+
+00:06:58.780 --> 00:07:00.660
+in terms of being very ergonomic and good
+
+00:07:00.740 --> 00:07:01.800
+at doing that type of work.
+
+00:07:02.180 --> 00:07:05.840
+- Right, and those two worlds meet in,
+
+00:07:06.860 --> 00:07:08.560
+they meet in the web, right?
+
+00:07:08.720 --> 00:07:13.240
+But the people doing the Python work, they aren't necessarily web developers.
+
+00:07:13.860 --> 00:07:18.660
+You gave a talk a year ago about anywidget that I'll definitely link to.
+
+00:07:18.960 --> 00:07:25.060
+And you have this graphic here like, okay, we've got the front end, which often is Jupyter.
+
+00:07:25.200 --> 00:07:26.740
+Not always, as we'll see in a minute.
+
+00:07:27.020 --> 00:07:31.780
+But often is Jupyter with the way you build for it is with all these web front end tools
+
+00:07:32.240 --> 00:07:36.460
+that even me doing web development is a lot of times I'm like, what is this?
+
+00:07:36.460 --> 00:07:37.600
+And why is it so complicated?
+
+00:07:38.640 --> 00:07:39.600
+I just don't even know.
+
+00:07:40.240 --> 00:07:41.600
+And I do web stuff all the time.
+
+00:07:42.060 --> 00:07:43.260
+And then you've got the backend stuff
+
+00:07:43.420 --> 00:07:45.700
+where Python people live doing NumPy,
+
+00:07:46.080 --> 00:07:47.880
+Polars, Matplotlib, et cetera.
+
+00:07:48.280 --> 00:07:50.560
+Do you want to riff on that challenge a little bit?
+
+00:07:50.640 --> 00:07:52.260
+Because I know your project is really here
+
+00:07:52.290 --> 00:07:54.640
+to kind of solve some of this disjointedness. 
+ +00:07:54.940 --> 00:07:57.340 +Yeah, I think one interesting thing, + +00:07:57.510 --> 00:07:58.420 +like taking a step back, + +00:07:58.490 --> 00:08:00.840 +and I mentioned this in the talk that I gave a while back, + +00:08:01.060 --> 00:08:03.100 +but like the web and Python ecosystems + +00:08:03.320 --> 00:08:05.780 +have been around for almost the same exact amount of time. + +00:08:06.120 --> 00:08:07.339 +Like I think like the first webpage + +00:08:07.340 --> 00:08:10.780 +was around the same year that the first release of Python came out. + +00:08:10.880 --> 00:08:14.680 +And so they sort of have been evolving in their own respective, + +00:08:15.040 --> 00:08:17.980 +or developing their own respective ecosystems for that period of time. + +00:08:18.400 --> 00:08:22.120 +And then as the web became really this platform for building user interfaces, + +00:08:22.560 --> 00:08:27.340 +I think that some pretty visionary folks saw that you could connect + +00:08:28.160 --> 00:08:31.540 +these backend resources to these very interactive applications + +00:08:31.960 --> 00:08:33.479 +that could be developed very quickly. + +00:08:34.419 --> 00:08:36.799 +So over the years, these things that started off very far apart + +00:08:36.820 --> 00:08:38.120 +have now come very close together. + +00:08:38.979 --> 00:08:41.520 +And now with WebAssembly, sometimes it's just all the, + +00:08:41.909 --> 00:08:43.000 +like turtles all the way down, + +00:08:43.159 --> 00:08:44.360 +it's all actually running in the browser. + +00:08:44.760 --> 00:08:46.220 +So it's pretty amazing. + +00:08:46.700 --> 00:08:48.900 +But I think that there definitely is this friction. 
+
+00:08:49.240 --> 00:08:51.800
+And one thing I've also noticed from being in both these,
+
+00:08:51.970 --> 00:08:54.040
+the web and Python ecosystems for so long
+
+00:08:54.220 --> 00:08:56.180
+is sort of these like ebbs and flows of like maturity
+
+00:08:56.560 --> 00:08:59.320
+or like places that need to catch up to one another.
+
+00:08:59.450 --> 00:09:02.940
+And so I came from definitely like the web side originally.
+
+00:09:03.070 --> 00:09:05.560
+And there's a lot that I took for granted, I guess,
+
+00:09:05.640 --> 00:09:09.360
+and that ecosystem of tooling that for a while, I think,
+
+00:09:09.560 --> 00:09:11.560
+gave the language a very bad rap
+
+00:09:11.800 --> 00:09:13.740
+because it was sort of the scripting language that was invented,
+
+00:09:13.920 --> 00:09:15.780
+I think, famously in two weeks.
+
+00:09:16.480 --> 00:09:18.800
+And now people are trying to build applications in it.
+
+00:09:18.940 --> 00:09:23.200
+But over the years, there have been standards and things have matured.
+
+00:09:23.200 --> 00:09:25.280
+And actually, the web is very backwards compatible
+
+00:09:25.420 --> 00:09:28.320
+and has some of these properties that are kind of amazing
+
+00:09:28.680 --> 00:09:30.080
+for something that's been around for that long.
+
+00:09:30.720 --> 00:09:32.300
+And then I'd say in the last couple of years,
+
+00:09:32.680 --> 00:09:39.780
+we've seen a bunch of new developer tool work going into the Python ecosystem that in some part
+
+00:09:39.820 --> 00:09:44.060
+is also inspired by some of the tooling in the web ecosystem. So I love to see how these things
+
+00:09:44.100 --> 00:09:48.820
+go back and forth. And I think if you haven't checked in on the ecosystem for a while,
+
+00:09:49.020 --> 00:09:55.180
+you might think it isn't what it was five years ago as it is today. 
And one of the goals of AnyWidget

+00:09:55.280 --> 00:10:01.699
+is to try to demystify some of maybe those prior bad experiences folks had developing and just
+
+00:10:01.860 --> 00:10:07.120
+you know, get back to like opening up some HTML and writing some JavaScript and getting your Python
+
+00:10:07.320 --> 00:10:13.500
+data in the browser. Yeah, absolutely. You know, I wonder, as you describe it being made in two
+
+00:10:13.720 --> 00:10:21.060
+weeks, which I also think is true, when it was made, I think, at Mozilla, right? Yeah. Anyway,
+
+00:10:21.660 --> 00:10:28.219
+one of JavaScript's real issues is it doesn't have a real numerical system, right? Everything is
+
+00:10:28.240 --> 00:10:32.780
+a float. That has a lot of problems for data science when it really, really has to be integers
+
+00:10:33.400 --> 00:10:39.000
+or whatever, right? Like you need more control over the numbers. I wonder how much of Python's
+
+00:10:39.260 --> 00:10:43.500
+benefit in the data science space originates from people going, JavaScript's super popular,
+
+00:10:43.960 --> 00:10:48.120
+but can't do numbers there. I do science, can't do JavaScript. What am I doing now? You know what
+
+00:10:48.120 --> 00:10:52.260
+I mean? What do you think about that? It just occurred to me. Yeah, it's interesting. Like that
+
+00:10:52.520 --> 00:10:58.200
+is originally like a constraint of the language. And maybe, you know, when you're picking ecosystems
+
+00:10:58.220 --> 00:11:04.600
+on. Also, there was no server-side way to run JavaScript until 2010. So the idea of just running this
+
+00:11:04.840 --> 00:11:11.660
+without a browser was also not very good. So yeah, I'm not quite sure, but it's funny because
+
+00:11:11.900 --> 00:11:17.660
+JavaScript has sort of reacted to that by adding some big ints as a data type. 
But also now there's

+00:11:17.800 --> 00:11:21.860
+sort of like a fourth language for the web, which is WebAssembly, and there are proper data types
+
+00:11:22.220 --> 00:11:28.180
+in WebAssembly. So I guess it is sort of reactionary, but I haven't really seen. So now you can run
+
+00:11:28.200 --> 00:11:33.600
+Python in the browser. And, but you'd have to have all this like ecosystem drift or migration that I
+
+00:11:33.940 --> 00:11:38.220
+just don't think anybody wants. So it's more, maybe these ecosystems can play a lot nicer together.
+
+00:11:39.160 --> 00:11:44.100
+Yeah. Yeah. And certainly they are with things like Jupyter and stuff, right? Like most users
+
+00:11:44.280 --> 00:11:48.780
+of Jupyter don't write JavaScript. Yes. And I think that everyone's happy about that. So
+
+00:11:49.839 --> 00:11:52.460
+exactly. That's true. There's a lot of people who are like, you know what, that's not for me.
+
+00:11:52.600 --> 00:11:58.360
+So, you know, you were at Harvard doing your work and now you're at Marimo?
+
+00:11:58.720 --> 00:11:59.020
+Marimo?
+
+00:11:59.420 --> 00:12:00.780
+I always mess up saying this.
+
+00:12:01.339 --> 00:12:02.360
+Tell us how to do it.
+
+00:12:02.560 --> 00:12:03.140
+It's Marimo.
+
+00:12:03.860 --> 00:12:04.260
+Marimo.
+
+00:12:04.420 --> 00:12:05.020
+Okay, Marimo.
+
+00:12:05.600 --> 00:12:05.680
+Yeah.
+
+00:12:05.960 --> 00:12:09.840
+And I'd actually had Akshay on, and I know we talked a lot about it and he set me straight and then
+
+00:12:10.060 --> 00:12:10.900
+I drifted, I'm sure.
+
+00:12:11.300 --> 00:12:14.500
+So what's your experience here?
+
+00:12:14.600 --> 00:12:17.560
+Like, how's it coming from academia to this world?
+
+00:12:17.960 --> 00:12:18.560
+It's been great.
+
+00:12:18.720 --> 00:12:25.380
+I think that having worked with, so part of AnyWidget was trying to address some user needs working inside of Notebooks. 
+
+00:12:25.540 --> 00:12:32.160
+And there's only so much that you can do as a plugin to get that extra bit to help users.
+
+00:12:32.320 --> 00:12:37.500
+And so I think one really exciting thing about working on this team with Marimo is they got very involved,
+
+00:12:37.600 --> 00:12:42.880
+or they were aware of AnyWidget a while ago, and they kind of actually legitimized AnyWidget as,
+
+00:12:43.420 --> 00:12:48.240
+hey, this is a standard because we were able to implement around this specification.
+
+00:12:48.740 --> 00:12:50.220
+And so that got us together.
+
+00:12:50.410 --> 00:13:00.000
+And then from there, it's just been super exciting to work with a team that is so passionate about notebooks and thinking about this next generation of notebooks and what that environment needs to look like.
+
+00:13:00.030 --> 00:13:07.540
+And so there's much more that we can do inside of Marimo beyond the AnyWidget specification, but also AnyWidget is an important component of that.
+
+00:13:07.800 --> 00:13:10.780
+So I like to be able to sort of think about both of these two worlds.
+
+00:13:11.140 --> 00:13:11.560
+That's cool.
+
+00:13:11.610 --> 00:13:17.040
+So when I use widgets in Marimo, it's often an AnyWidget?
+
+00:13:17.280 --> 00:13:40.180
+Yep. Yeah, exactly. So when folks usually come and they want to like extend Marimo with their own custom UI elements, like our answer is you should make an AnyWidget. So that's pretty exciting. But then if you're using things in like the sidebar or things that like sort of integrate outside of like the notebook view, those are more custom elements that we've created at Marimo that aren't necessarily based on AnyWidget.
+
+00:13:42.140 --> 00:13:45.100
+This portion of Talk Python To Me is brought to you by Sentry.
+
+00:13:46.400 --> 00:13:47.280
+Let me ask you a question.
+
+00:13:48.060 --> 00:13:49.660
+What if you could see into the future? 
+
+00:13:50.420 --> 00:13:51.680
+We're talking about Sentry, of course,
+
+00:13:51.940 --> 00:13:56.280
+so that means seeing potential errors, crashes, and bugs before they happen,
+
+00:13:56.720 --> 00:13:58.500
+before you even accept them into your code base.
+
+00:13:59.140 --> 00:14:02.280
+That's what Sentry's Seer AI Code Review offers.
+
+00:14:03.080 --> 00:14:06.420
+You get error prediction based on real production history.
+
+00:14:06.960 --> 00:14:10.460
+Seer AI Code Review flags the most impactful errors
+
+00:14:10.520 --> 00:14:16.420
+your PR is likely to introduce before merge using your app's error and performance context,
+
+00:14:16.860 --> 00:14:18.760
+not just generic LLM pattern matching.
+
+00:14:19.680 --> 00:14:24.660
+Seer will then jump in on new PRs with feedback and warning if it finds any potential issues.
+
+00:14:25.460 --> 00:14:26.240
+Here's a real example.
+
+00:14:26.940 --> 00:14:29.720
+On a new PR related to a search feature in a web app,
+
+00:14:29.860 --> 00:14:34.320
+we see a comment from Seer by Sentry bot in the PR.
+
+00:14:34.960 --> 00:14:35.420
+And it says,
+
+00:14:35.520 --> 00:14:38.600
+potential bug, the process search results function,
+
+00:14:38.860 --> 00:14:40.580
+can enter an infinite recursion
+
+00:14:40.940 --> 00:14:42.600
+when a search query finds no matches,
+
+00:14:43.380 --> 00:14:45.740
+as the recursive call lacks a return statement
+
+00:14:45.940 --> 00:14:47.580
+and a proper termination condition.
+
+00:14:48.140 --> 00:14:51.440
+And Seer AI Code Review also provides additional details
+
+00:14:51.780 --> 00:14:53.860
+which you can expand for further information
+
+00:14:54.180 --> 00:14:55.780
+on the issue and suggested fixes.
+
+00:14:56.460 --> 00:14:57.800
+And bam, just like that,
+
+00:14:57.920 --> 00:15:00.600
+Seer AI Code Review has stopped a bug in its tracks
+
+00:15:00.960 --> 00:15:02.280
+without any devs in the loop. 
+
+00:15:02.680 --> 00:15:05.500
+A nasty infinite loop bug never made it into production.
+
+00:15:06.220 --> 00:15:06.940
+Here's how you set it up.
+
+00:15:07.400 --> 00:15:10.720
+You enable the GitHub Sentry integration on your Sentry account.
+
+00:15:11.160 --> 00:15:13.320
+Enable Seer AI on your Sentry account.
+
+00:15:13.820 --> 00:15:16.640
+And on GitHub, you install the Seer by Sentry app
+
+00:15:16.720 --> 00:15:19.380
+and connect it to your repositories that you want it to validate.
+
+00:15:19.740 --> 00:15:22.580
+So jump over to Sentry and set up code review for yourself.
+
+00:15:23.060 --> 00:15:27.120
+Just visit talkpython.fm/seer-code-review.
+
+00:15:27.460 --> 00:15:30.240
+The link is in your podcast player show notes and on the episode page.
+
+00:15:30.720 --> 00:15:32.980
+Thank you to Sentry for supporting Talk Python To Me.
+
+00:15:34.400 --> 00:15:34.860
+Why Marimo?
+
+00:15:35.140 --> 00:15:37.260
+I know I had Akshay on and we talked about it.
+
+00:15:37.390 --> 00:15:41.480
+When I look at this, it feels like a super modern UI that is just,
+
+00:15:41.750 --> 00:15:47.720
+it's got a lot of polish and a lot of, it feels 2025 and working with it, right?
+
+00:15:48.440 --> 00:15:53.940
+And it solves some of the, I consider Jupyter Notebooks like the world's most insane,
+
+00:15:54.230 --> 00:15:58.599
+not the world's most insane, the second most insane series of go-to statements
+
+00:15:58.600 --> 00:16:01.060
+that have no record of how you got the go-to,
+
+00:16:01.220 --> 00:16:02.340
+you know, it doesn't even say go to 10.
+
+00:16:02.460 --> 00:16:03.780
+It's like, well, what did you go and run?
+
+00:16:04.000 --> 00:16:05.520
+I don't know, like, look at the numbers,
+
+00:16:05.760 --> 00:16:07.980
+but there's like potentially, you know,
+
+00:16:08.140 --> 00:16:09.400
+something lost as you rerun them.
+
+00:16:10.020 --> 00:16:11.060
+And it solves that problem as well. 
+
+00:16:11.140 --> 00:16:13.440
+But like, you know, what's the elevator pitch for Marimo?
+
+00:16:13.760 --> 00:16:15.420
+Yeah, my elevator pitch for Marimo
+
+00:16:15.840 --> 00:16:18.940
+is that I think notebooks are incredibly important
+
+00:16:19.260 --> 00:16:22.000
+and that it's hard to deny that like they're a tool
+
+00:16:22.140 --> 00:16:24.180
+that many folks use to do their daily work.
+
+00:16:25.480 --> 00:16:26.799
+But there are very few like guardrails
+
+00:16:26.820 --> 00:16:30.180
+to help you do that work and work on a team with other folks.
+
+00:16:30.400 --> 00:16:33.200
+And so I think one of the big selling points to me with Marimo
+
+00:16:33.360 --> 00:16:36.240
+is just that you sort of get to free yourself
+
+00:16:36.420 --> 00:16:37.840
+from thinking about those go-to statements,
+
+00:16:38.120 --> 00:16:41.580
+and instead you have this very deterministic execution
+
+00:16:41.920 --> 00:16:43.860
+that sort of just feels natural.
+
+00:16:44.140 --> 00:16:46.280
+As you write cells, they will re-execute.
+
+00:16:47.600 --> 00:16:50.000
+And then on top of that, I think that Jupyter and Marimo
+
+00:16:50.120 --> 00:16:51.640
+have slightly different goals.
+
+00:16:51.880 --> 00:16:53.780
+So Jupyter, when I showed that diagram,
+
+00:16:53.840 --> 00:16:54.720
+we had that diagram before,
+
+00:16:54.980 --> 00:16:58.400
+you have this split between these diversity of front ends.
+
+00:16:58.460 --> 00:17:00.220
+So that's like you could use it in Colab,
+
+00:17:00.240 --> 00:17:02.160
+you could use a Jupyter kernel inside of VS Code,
+
+00:17:02.280 --> 00:17:05.620
+you could use a Jupyter kernel sort of within the Jupyter CLI.
+
+00:17:06.000 --> 00:17:08.780
+But then there's a whole other requirement of that ecosystem,
+
+00:17:08.780 --> 00:17:10.360
+which is to support many different kernels. 
+
+00:17:10.480 --> 00:17:12.640
+So it's not just Python, it could be an R kernel,
+
+00:17:12.760 --> 00:17:13.740
+it could be a Julia kernel.
+
+00:17:14.079 --> 00:17:17.240
+And so I think ultimately a tension that ends up happening
+
+00:17:17.380 --> 00:17:20.060
+is like a priority of that ecosystem is to support all these languages
+
+00:17:20.360 --> 00:17:21.420
+and all these different front ends.
+
+00:17:21.540 --> 00:17:23.680
+And instead, by just focusing on Python,
+
+00:17:23.980 --> 00:17:26.800
+we can really try to develop a really integrated
+
+00:17:27.100 --> 00:17:30.040
+and rich experience specifically for Python notebooks.
+
+00:17:30.360 --> 00:17:32.800
+So the trade-off there being some folks come and they say,
+
+00:17:33.040 --> 00:17:34.400
+oh, we'd love Marimo, but for R.
+
+00:17:34.420 --> 00:17:36.480
+And we're like, well, we can't really do that.
+
+00:17:37.100 --> 00:17:38.080
+It's called JupyterLab.
+
+00:17:38.800 --> 00:17:40.900
+Yeah, but we are hyper-focused on Python
+
+00:17:41.040 --> 00:17:44.000
+and that lets us integrate with other trends within the ecosystem too.
+
+00:17:44.180 --> 00:17:46.960
+So we're a huge fan of all the work from the Astral folks.
+
+00:17:47.600 --> 00:17:50.400
+And we have a tight integration with uv, for example,
+
+00:17:50.460 --> 00:17:54.300
+to allow you to sort of use that PEP 723 script metadata.
+
+00:17:54.870 --> 00:17:57.500
+And as you're working, we'll like install packages for you
+
+00:17:57.500 --> 00:17:59.760
+and write that metadata so that at the end,
+
+00:17:59.770 --> 00:18:02.140
+you can send that document to someone else
+
+00:18:02.210 --> 00:18:03.660
+and they can like bootstrap their notebook
+
+00:18:03.750 --> 00:18:04.580
+and get all their dependencies
+
+00:18:04.910 --> 00:18:06.160
+and have their environment set up. 
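[Editor's note] The PEP 723 flow described above can be sketched concretely. What follows is a minimal, hedged illustration: the dependency names and the toy extractor function are illustrative, not from the episode. The metadata is just a comment block at the top of the `.py` file, which is what lets a tool like uv (via `uv run notebook.py`) build a matching environment before executing it, and what lets Marimo write the block for you as you install packages.

```python
import re

# A sample notebook file header using PEP 723 inline script metadata.
# The "# /// script" ... "# ///" fence and the TOML keys inside it are the
# standardized part; the specific dependencies are made up for this sketch.
SAMPLE = """\
# /// script
# requires-python = ">=3.11"
# dependencies = ["polars", "altair"]
# ///
import polars as pl
"""

def script_dependencies(source: str) -> list[str]:
    """Toy extractor: pull the dependencies list out of a PEP 723 block.

    Real tools parse the block as TOML; a regex is enough to show the shape.
    """
    match = re.search(r'dependencies\s*=\s*\[([^\]]*)\]', source)
    if not match:
        return []
    # Grab each quoted package name inside the list.
    return re.findall(r'"([^"]+)"', match.group(1))

print(script_dependencies(SAMPLE))  # ['polars', 'altair']
```

Because the metadata lives in comments, the file stays a plain, runnable, git-diffable Python script, which is exactly the property the conversation highlights.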
+
+00:18:06.360 --> 00:18:10.340
+And I think that type of like last mile with these notebooks
+
+00:18:10.600 --> 00:18:11.780
+is something that's been hard to get at
+
+00:18:11.890 --> 00:18:14.020
+without really focusing on like an ecosystem.
+
+00:18:14.320 --> 00:18:16.400
+That last mile though, that's a lot of polish.
+
+00:18:16.810 --> 00:18:20.440
+And that's often what kind of gets lost
+
+00:18:20.600 --> 00:18:24.960
+in projects where, I don't know, it's not exactly the focus, right?
+
+00:18:25.220 --> 00:18:28.220
+It's something where you want to take on the most interesting features
+
+00:18:28.880 --> 00:18:31.660
+and it's not like it's necessarily a company
+
+00:18:31.840 --> 00:18:32.640
+and there's somebody who's like,
+
+00:18:32.700 --> 00:18:35.360
+no, we're polishing every little rough edge, period.
+
+00:18:35.760 --> 00:18:36.100
+You know what I mean?
+
+00:18:36.560 --> 00:18:38.420
+And I think you mentioned Astral and uv.
+
+00:18:38.580 --> 00:18:41.840
+I think that is an example of they got funding
+
+00:18:42.040 --> 00:18:44.880
+and it's like, okay, we're not going to do 95% of the packaging.
+
+00:18:45.000 --> 00:18:47.080
+We're going to do 99.9% of the packaging.
+
+00:18:47.220 --> 00:18:47.600
+You know what I mean?
+
+00:18:47.820 --> 00:18:49.520
+And it's made a tremendous difference.
+
+00:18:50.020 --> 00:18:50.200
+Yeah.
+
+00:18:50.560 --> 00:18:53.280
+Yeah, I mean, I have plenty to say about Astral.
+
+00:18:53.300 --> 00:18:55.280
+We love all their tooling over here too.
+
+00:18:56.060 --> 00:18:56.700
+Yeah, so do I.
+
+00:18:56.900 --> 00:19:01.740
+I switched all my stuff to Astral, to uv and Ruff, and yeah, super, super neat.
+
+00:19:02.720 --> 00:19:08.360
+Okay, so another thing that you all just released is a new VS Code plugin. 
+
+00:19:08.560 --> 00:19:13.060
+So I guess one of the aspects of Marimo that is not necessarily apparent from looking at
+
+00:19:13.160 --> 00:19:18.640
+it going there is that it's backed by a Python file, not a JSON file that contains both the
+
+00:19:18.660 --> 00:19:24.040
+inputs and outputs, which is like one of the big shortcomings of Jupyter. Like how much of Jupyter
+
+00:19:24.140 --> 00:19:29.260
+would have benefited if it just had an input and an output file? So you don't check in the output
+
+00:19:29.460 --> 00:19:34.780
+file potentially. You know what I mean? Yeah. Yeah. We, I think that was pretty visionary or maybe it
+
+00:19:34.820 --> 00:19:40.360
+was just, it's more that we have this ability to observe what's happened in the Jupyter ecosystem
+
+00:19:40.560 --> 00:19:43.840
+and make adjustments from there. A little bit of a second mover benefit of like, we saw that,
+
+00:19:44.000 --> 00:19:44.960
+that was 80% good.
+
+00:19:45.900 --> 00:19:46.240
+Definitely.
+
+00:19:46.620 --> 00:19:47.680
+Yeah, it was surprising.
+
+00:19:48.100 --> 00:19:50.640
+I mean, I think to riff more on Marimo,
+
+00:19:50.940 --> 00:19:51.940
+just there are all these,
+
+00:19:52.240 --> 00:19:53.800
+when people ask that question of like why Marimo,
+
+00:19:54.000 --> 00:19:55.540
+I think there are all these like very small things,
+
+00:19:55.900 --> 00:19:59.300
+but like that in net, I think are certainly
+
+00:19:59.440 --> 00:20:02.240
+like an order of magnitude kind of like richer experience
+
+00:20:02.440 --> 00:20:02.920
+for our users.
+
+00:20:04.320 --> 00:20:06.540
+But it's funny how many times just the fact
+
+00:20:06.720 --> 00:20:08.180
+that like our notebooks are git-diffable,
+
+00:20:08.500 --> 00:20:10.080
+that's like one of the first things
+
+00:20:10.180 --> 00:20:11.420
+that folks like really latch onto
+
+00:20:11.600 --> 00:20:12.680
+and love about Marimo notebooks. 
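Since Marimo notebooks are plain Python files rather than JSON with embedded outputs, this is roughly what one looks like on disk (a sketch based on marimo's documented file format; the cell contents are invented). Because it's just Python, the standard library can parse it and git diffs it cleanly:

```python
import ast

# Roughly the shape of a Marimo notebook on disk (cell contents invented):
# each cell is a function whose parameters and return tuple encode the
# dataflow between cells.
NOTEBOOK = '''\
import marimo

app = marimo.App()

@app.cell
def _():
    x = 1
    return (x,)

@app.cell
def _(x):
    y = x + 1
    return (y,)

if __name__ == "__main__":
    app.run()
'''

# No JSON, no stored outputs: the file is valid Python source.
tree = ast.parse(NOTEBOOK)
print([type(node).__name__ for node in tree.body])
```

That last property is exactly why these notebooks diff well: the only thing under version control is source code, never rendered output.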
+
+00:20:13.540 --> 00:20:19.100
+Yeah, yeah, very cool. And so the VS Code extension, what's the, like, how's this, how's this help us?
+
+00:20:19.720 --> 00:20:30.540
+Yeah, so I see this as, I think the best experience with using Marimo will always be sort of our CLI or like our, our web interface that we've like really curated around this experience.
+
+00:20:31.260 --> 00:20:36.020
+But we have a broad range of users that want to collaborate or work on Marimo documents.
+
+00:20:36.340 --> 00:20:44.360
+And so some of them are like, you know, I live in my VS Code, my Cursor, my Windsurf, my IDE, and like I never want to leave this environment.
+
+00:20:44.580 --> 00:20:48.880
+And so getting them to come and like collaborate on these documents has been a little bit of a barrier there.
+
+00:20:49.160 --> 00:20:57.000
+So they could, of course, edit a Python file on disk, but having like a native notebook view like they do for Jupyter was something that we sort of lacked in those environments.
+
+00:20:57.360 --> 00:21:06.140
+And so we made it a priority shortly after I joined the team where we really wanted to address that user base of folks that want to live in that editor environment.
+
+00:21:06.700 --> 00:21:16.480
+And sort of from the ground up wrote this new extension where we have a very integrated view with sort of reactive outputs, similarly to what we have in our user interface.
+
+00:21:16.680 --> 00:21:21.360
+But it sort of lives inside of the VS Code ecosystem and uses the VS Code APIs.
+
+00:21:22.220 --> 00:21:23.640
+Yeah, that makes a lot of sense.
+
+00:21:24.100 --> 00:21:25.760
+And it's available on the, what is it?
+
+00:21:25.920 --> 00:21:30.400
+Open VSX marketplace for like Cursor and Antigravity.
+
+00:21:30.540 --> 00:21:32.080
+That's a thing as of like three days ago.
+
+00:21:32.880 --> 00:21:34.960
+Yes, all these different skins of VS Code,
+
+00:21:37.320 --> 00:21:38.820
+our extension should work as well. 
+
+00:21:38.870 --> 00:21:40.460
+And we publish it to both marketplaces.
+
+00:21:40.650 --> 00:21:42.640
+So folks should be able to install it and get started.
+
+00:21:42.940 --> 00:21:43.980
+It's a weird time for editors.
+
+00:21:44.800 --> 00:21:45.080
+Yes.
+
+00:21:45.280 --> 00:21:46.960
+It was really kind of strange.
+
+00:21:47.380 --> 00:21:49.540
+When I first started Talk Python 11 years ago
+
+00:21:49.640 --> 00:21:50.280
+or whatever it was,
+
+00:21:50.550 --> 00:21:52.560
+I would ask people what editor they use
+
+00:21:52.590 --> 00:21:53.660
+at the end of the show.
+
+00:21:53.680 --> 00:21:58.160
+I didn't know what they were going to say. It could have been anything. And then it just narrowed
+
+00:21:58.220 --> 00:22:05.160
+down to VS Code, PyCharm. Those are the answers. And now it's kind of back to a lot of options
+
+00:22:05.310 --> 00:22:11.220
+again. But it generally works in the VS Code variants as well. All right. Super cool. Okay.
+
+00:22:12.720 --> 00:22:17.640
+Let's talk about just what is a Jupyter widget? Yeah. 
So like I mentioned before, Jupyter really
+
+00:22:18.100 --> 00:22:22.900
+reaches, and Marimo also, across this chasm between your Python code that's running in some,
+
+00:22:23.100 --> 00:22:26.820
+we call it a kernel environment, a stateful environment that has your variables. And
+
+00:22:26.820 --> 00:22:30.960
+when you run cells, things update there. And then on the front end there's the cell outputs
+
+00:22:31.320 --> 00:22:36.080
+that are a reflection of whatever is living in the kernel. And a Jupyter widget
+
+00:22:36.320 --> 00:22:41.780
+is a very interesting part of that crossing of that chasm, in that
+
+00:22:42.000 --> 00:22:46.340
+you can actually author both the front end code and the back end code to talk to one another. So
+
+00:22:46.480 --> 00:22:50.320
+typically, if you write a Python library, you're only writing something that runs
+
+00:22:50.320 --> 00:22:54.400
+on the server. And then if you're making a Jupyter extension, for example, that's something that only
+
+00:22:54.580 --> 00:22:59.420
+lives in the front end. But a widget allows a library author to connect those two worlds together.
+
+00:22:59.710 --> 00:23:05.260
+And so for your small object that lives in the Python kernel, you can define ways that it can
+
+00:23:05.540 --> 00:23:11.700
+render itself and also be updated from that front end. And so it's a very powerful mechanism that's
+
+00:23:11.880 --> 00:23:17.159
+officially a part of Jupyter that allows you to extend your output cells with, I like to think of
+
+00:23:17.120 --> 00:23:21.400
+it as these web-based superpowers to be able to update your code in the kernel. 
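The kernel/front-end split being described can be sketched as a tiny observer model: shared state that either side can read and write, with change notifications flowing both ways. All names here are invented; the real mechanism serializes these updates over Jupyter's comm channel:

```python
# Toy sketch of the widget model idea: shared state that notifies
# observers on either "side" (kernel or front end) when it changes.
# Names are invented; real widgets use Jupyter's comm protocol.
class ToyWidgetModel:
    def __init__(self, **state):
        self._state = dict(state)
        self._observers = []

    def observe(self, callback):
        # Register a change listener, as the front end (or kernel) would.
        self._observers.append(callback)

    def get(self, key):
        return self._state[key]

    def set(self, key, value):
        # An update from either side notifies everyone else.
        self._state[key] = value
        for callback in self._observers:
            callback(key, value)


model = ToyWidgetModel(value=0)
seen = []
model.observe(lambda key, value: seen.append((key, value)))

model.set("value", 3)  # e.g. a slider dragged in the browser
print(model.get("value"), seen)  # 3 [('value', 3)]
```

The point of the sketch: neither side owns the state; both render from it and mutate it, which is the "crossing of the chasm" described above.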
+
+00:23:21.680 --> 00:23:26.740
+You have a really interesting point that you made in your SciPy talk, saying basically,
+
+00:23:27.220 --> 00:23:34.460
+we can do lots of UI stuff in notebooks, but often those are kind of like isolated snapshots,
+
+00:23:35.020 --> 00:23:40.040
+right? Maybe you have a graph and there's like outliers or something along those lines,
+
+00:23:40.460 --> 00:23:45.580
+but you can see them, but you can't like programmatically interact with them in the
+
+00:23:45.480 --> 00:23:50.140
+way that you could with a widget, where you might drag and select to get a zoom area that actually
+
+00:23:50.300 --> 00:23:56.300
+becomes a data frame, or you could move some widgets that then redraw the picture, but also
+
+00:23:56.800 --> 00:24:00.100
+becomes something you could program against in like a machine learning algorithm or whatever,
+
+00:24:00.320 --> 00:24:05.100
+right? I think that's a pretty powerful distinction that people want to have in mind as
+
+00:24:05.100 --> 00:24:10.040
+we dive into anywidget. Yeah, definitely. 
I like to think of it as, in a traditional
+
+00:24:10.200 --> 00:24:15.440
+REPL kind of environment, your two modes of input are writing code and running it.
+
+00:24:15.960 --> 00:24:19.700
+So you can write the code and you can run the cell, and then you can observe the output. But in order
+
+00:24:19.730 --> 00:24:24.620
+to act on something that you see in that output, you have to write more code. So if you
+
+00:24:24.780 --> 00:24:29.100
+have a plot, for example, that you produce from one of those outputs, if there's some outliers or
+
+00:24:29.170 --> 00:24:33.740
+something, a very natural thing to think is, what is that point? And you want to circle that point.
+
+00:24:33.980 --> 00:24:38.120
+And that is much easier to do with a gesture than it is to express as some sort of
+
+00:24:38.940 --> 00:24:44.460
+query, typing out some program and kind of looking at the axes to say what are those
+
+00:24:44.480 --> 00:24:50.500
+points there. And so I think widgets really allow for you to be creative in terms of how you
+
+00:24:51.760 --> 00:24:56.080
+might add another mode of input for adjusting things that might be in the kernel. And like
+
+00:24:56.160 --> 00:25:00.640
+you're saying, now you can transform a selection into just a data frame that you get out,
+
+00:25:00.840 --> 00:25:04.680
+and then you're off to the races with running the next part of your program. And so I think it
+
+00:25:04.880 --> 00:25:09.540
+adds this degree of freedom to let the algorithms and data, like, use code when that's
+
+00:25:09.640 --> 00:25:14.440
+useful. And then for those pieces where you want to step in as a human and maybe use a gesture
+
+00:25:14.440 --> 00:25:16.380
+because that's easier to express as a gesture,
+
+00:25:16.580 --> 00:25:18.720
+then we have that mechanism now
+
+00:25:18.780 --> 00:25:20.880
+to allow you to extend your analyses.
+
+00:25:21.340 --> 00:25:21.680
+Right. 
+ +00:25:21.900 --> 00:25:24.220 +If you don't need pictures and interaction, + +00:25:24.440 --> 00:25:26.000 +just do it in Python straight. + +00:25:26.400 --> 00:25:26.940 +You know what I mean? + +00:25:27.040 --> 00:25:28.500 +You don't need notebooks. + +00:25:28.820 --> 00:25:30.580 +And part of the value of notebooks + +00:25:30.980 --> 00:25:34.980 +is this mix of like a visual plus code plus visual + +00:25:35.200 --> 00:25:37.480 +and this sort of iterative back and forth between those. + +00:25:38.080 --> 00:25:40.560 +And widgets reconnect the visual + +00:25:40.680 --> 00:25:42.400 +back to the code, I guess, right? + +00:25:42.680 --> 00:25:43.280 +Yeah, exactly. + +00:25:43.640 --> 00:25:47.000 +I like to think of it as the types of workflows in notebooks + +00:25:47.170 --> 00:25:48.920 +are not typically batch scripts. + +00:25:49.220 --> 00:25:51.220 +I mean, in Marimo, you can write your notebook + +00:25:51.330 --> 00:25:53.580 +and run it as a script because it's just Python. + +00:25:54.300 --> 00:25:56.060 +But often when people are working + +00:25:56.240 --> 00:25:57.660 +and really getting their hands on data, + +00:25:57.790 --> 00:25:59.240 +these are long-lived sessions + +00:25:59.510 --> 00:26:00.940 +where you have a lot of data in memory + +00:26:01.640 --> 00:26:03.680 +and you really want to get your hands on that data. + +00:26:03.880 --> 00:26:05.160 +And I think if you only had the ability + +00:26:05.270 --> 00:26:08.320 +to write and run cells and not add that extra dimension, + +00:26:10.000 --> 00:26:11.840 +it would be pretty limited to what you could actually do. + +00:26:12.160 --> 00:26:22.780 +And so something like widgets allows you to really extend and be creative with how you'd like to explore that data and sort of like debug your data as it's in memory and get your hands on it. + +00:26:22.960 --> 00:26:23.140 +Yeah. + +00:26:23.400 --> 00:26:23.500 +Yeah. 
+
+00:26:23.500 --> 00:26:26.760
+I mean, that's what blew the world open when all that came around, right?
+
+00:26:27.100 --> 00:26:27.280
+Yeah.
+
+00:26:27.580 --> 00:26:30.460
+So that brings us to anywidget.
+
+00:26:30.760 --> 00:26:31.980
+Why a new widget framework?
+
+00:26:32.160 --> 00:26:35.360
+Why not just IPy widgets or whatever?
+
+00:26:36.160 --> 00:26:37.000
+Yeah, it's a great question.
+
+00:26:37.360 --> 00:26:42.360
+And it's something that when I started my PhD, I guess I didn't really have a handle on either.
+
+00:26:42.820 --> 00:26:49.340
+But AnyWidget was definitely born out of a need of myself to solve a problem that I was running into during the PhD.
+
+00:26:49.580 --> 00:26:54.260
+So as I mentioned before, we had all these visualization tools that we were building sort of in.
+
+00:26:54.790 --> 00:26:59.380
+And specifically, one of the tools I was building was like a web-based genome browser.
+
+00:26:59.650 --> 00:27:07.320
+And so this is something that people put their genomes into and you can align different tracks and you can zoom in and try to understand like different
+
+00:27:07.340 --> 00:27:09.660
+types of functional assays in genomics.
+
+00:27:10.660 --> 00:27:12.520
+And then we had users that wanted to use these inside
+
+00:27:12.520 --> 00:27:13.340
+of computational notebooks.
+
+00:27:14.760 --> 00:27:16.560
+And one thing that immediately happened
+
+00:27:16.940 --> 00:27:18.740
+when we tried to build these integrations for notebooks
+
+00:27:19.120 --> 00:27:21.240
+is that you'd spend all this engineering effort trying
+
+00:27:21.320 --> 00:27:23.620
+to build an integration for a specific notebook. 
+
+00:27:24.220 --> 00:27:26.660
+But then someone would come in and say, oh, I'd actually
+
+00:27:26.840 --> 00:27:28.060
+want this to work in Google Colab,
+
+00:27:28.150 --> 00:27:29.540
+or I need this to work in VS Code,
+
+00:27:29.610 --> 00:27:31.860
+or I need this to work in this other environment.
+
+00:27:34.180 --> 00:27:37.300
+And all of those environments are very similar on the back end
+
+00:27:37.320 --> 00:27:38.760
+but they're very different on the front end
+
+00:27:38.880 --> 00:27:41.740
+in terms of the way that you can register like front end code.
+
+00:27:41.880 --> 00:27:45.240
+And so it ended up putting a lot on a library author like myself
+
+00:27:45.580 --> 00:27:47.900
+to try to build those adapters for each of those environments.
+
+00:27:48.480 --> 00:27:51.540
+And so anywidget was sort of an attempt to standardize
+
+00:27:51.660 --> 00:27:53.700
+maybe how we can write that front end code
+
+00:27:54.040 --> 00:27:54.960
+such that it's more,
+
+00:27:55.200 --> 00:27:57.180
+and then we can build those adapters in one place
+
+00:27:57.380 --> 00:27:59.500
+so that we can support all those different environments.
+
+00:27:59.540 --> 00:28:00.160
+Right, right.
+
+00:28:00.220 --> 00:28:01.300
+We talked about building the widgets
+
+00:28:01.980 --> 00:28:03.500
+largely in JavaScript earlier already,
+
+00:28:03.860 --> 00:28:06.740
+but I think what people maybe who haven't tried this
+
+00:28:06.780 --> 00:28:13.240
+don't realize is how many variations you would have to build. Say I wanted it to
+
+00:28:13.400 --> 00:28:19.160
+work in Jupyter and I wanted it to work in Colab, then you've got to build and publish to PyPI and npm,
+
+00:28:19.680 --> 00:28:24.960
+right? So everyone who built one of those and wanted it to cross over,
+
+00:28:25.580 --> 00:28:30.100
+they had to do all that adaptation. 
So, and your thought was basically, well, + +00:28:30.420 --> 00:28:34.940 +I could create an adapter that everyone could use and then we don't have to do this ever again. + +00:28:35.220 --> 00:28:48.920 +Yeah, exactly. And to what you're saying as well, I think that adapter craziness that's in the web ecosystem of like, oh my gosh, I had to transpile my code into all these different formats. + +00:28:49.100 --> 00:28:58.420 +It's kind of a nightmare in terms of tooling. And it's something that came as a result of the state of the JavaScript ecosystem being pretty immature before 2018. + +00:29:01.060 --> 00:29:03.800 +And basically because JavaScript didn't have a module system. + +00:29:04.420 --> 00:29:06.900 +And so everyone had to come up with a way to bundle code. + +00:29:07.000 --> 00:29:08.740 +And there are all these third-party adapters. + +00:29:08.860 --> 00:29:12.080 +They're figuring out a way that they could load JavaScript modules. + +00:29:12.340 --> 00:29:17.100 +And so whatever these tools chose and the way that they chose to load their front-end code + +00:29:17.120 --> 00:29:20.860 +would affect the way that me as the library author would need to package up the code. + +00:29:21.000 --> 00:29:26.720 +By modules and loading, you're talking where people might see the word require at the top + +00:29:26.780 --> 00:29:27.420 +or something like that. + +00:29:27.560 --> 00:29:28.240 +Kind of like our import. + +00:29:29.020 --> 00:29:29.480 +Yes, exactly. + +00:29:29.840 --> 00:29:32.020 +It's very much like import statements in Python. + +00:29:32.320 --> 00:29:35.140 +And in fact, the official, if you open up your browser, + +00:29:35.180 --> 00:29:39.460 +you can just type, you can use a syntax that's called ES modules + +00:29:39.880 --> 00:29:41.140 +and type in import. + +00:29:41.640 --> 00:29:45.120 +And you can import a bit of JavaScript code via URL. 
+
+00:29:45.560 --> 00:29:48.960
+And this just works natively in the browsers like today,
+
+00:29:50.020 --> 00:29:54.060
+but wasn't standardized until 2015, and not supported in browsers until around 2018.
+
+00:29:54.420 --> 00:29:58.399
+So you mentioned require, that's actually a different module system
+
+00:29:58.400 --> 00:30:00.120
+that's not based in browsers.
+
+00:30:00.280 --> 00:30:02.160
+And if you were to type require in your browser,
+
+00:30:02.400 --> 00:30:02.960
+that would not work.
+
+00:30:03.220 --> 00:30:04.400
+And that is sort of like the tension
+
+00:30:04.560 --> 00:30:06.020
+that was in the JavaScript ecosystem.
+
+00:30:06.340 --> 00:30:08.240
+So I was aware of some of the trends
+
+00:30:08.300 --> 00:30:09.180
+and things that have happened
+
+00:30:09.700 --> 00:30:10.960
+post this new module system
+
+00:30:11.060 --> 00:30:12.560
+because I've been working in the web ecosystem.
+
+00:30:13.100 --> 00:30:13.980
+And yet when I came to work
+
+00:30:14.080 --> 00:30:15.440
+on like Jupyter widgets specifically,
+
+00:30:15.680 --> 00:30:16.220
+there was like,
+
+00:30:16.480 --> 00:30:18.040
+I felt like I was writing code
+
+00:30:18.140 --> 00:30:20.560
+like I had been like a while ago for the browser.
+
+00:30:20.920 --> 00:30:22.960
+And so the idea behind anywidget was like,
+
+00:30:23.240 --> 00:30:25.300
+let's just simplify this so that my source
+
+00:30:25.780 --> 00:30:26.860
+and the code that I publish
+
+00:30:26.880 --> 00:30:29.460
+is more like what the browser understands natively.
+
+00:30:29.710 --> 00:30:31.800
+And then we take care of this like transpilation stuff
+
+00:30:32.560 --> 00:30:34.640
+once for everybody and package that up
+
+00:30:34.730 --> 00:30:36.640
+inside of anywidget to deal with sort of like
+
+00:30:36.720 --> 00:30:38.920
+the legacy module systems of like JupyterLab,
+
+00:30:39.080 --> 00:30:41.260
+VS Code, and Google Colab. 
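Concretely, the front-end code an anywidget author writes is exactly this kind of browser-native ES module: a `render` function exported as the default. A sketch (the counter is illustrative; the Python class shown in the comment follows anywidget's documented `_esm`/traitlets pattern):

```python
# The front end of an anywidget widget is plain, browser-native ESM:
# no require(), no bundler step for the author. This counter is a sketch.
COUNTER_ESM = """\
function render({ model, el }) {
  let btn = document.createElement("button");
  btn.textContent = `count is ${model.get("value")}`;
  btn.addEventListener("click", () => {
    model.set("value", model.get("value") + 1);
    model.save_changes();  // push the change back to the kernel
  });
  model.on("change:value", () => {  // react to kernel-side updates
    btn.textContent = `count is ${model.get("value")}`;
  });
  el.appendChild(btn);
}
export default { render };
"""

# With anywidget installed, the Python side is just (per its docs):
#
#   import anywidget, traitlets
#
#   class Counter(anywidget.AnyWidget):
#       _esm = COUNTER_ESM
#       value = traitlets.Int(0).tag(sync=True)

print("export default" in COUNTER_ESM)  # True
```

anywidget then handles adapting this one module to JupyterLab, VS Code, Colab, and the rest, which is the "transpilation once for everybody" point above.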
+ +00:30:42.540 --> 00:30:44.160 +This portion of Talk Python To Me + +00:30:44.240 --> 00:30:46.600 +is brought to you by JetBrains and the PyCharm team. + +00:30:47.440 --> 00:30:49.600 +The PSF, the Python Software Foundation, + +00:30:50.060 --> 00:30:52.540 +is a nonprofit organization behind Python. + +00:30:53.320 --> 00:30:55.100 +They run or oversee the conferences, + +00:30:55.680 --> 00:30:57.560 +handle all the legal business to do with Python, + +00:30:58.340 --> 00:31:01.320 +oversee the steering council and other groups, and much more. + +00:31:01.690 --> 00:31:05.140 +But let's focus in on one key word here, non-profit. + +00:31:05.940 --> 00:31:07.720 +Unlike some software ecosystems, + +00:31:07.870 --> 00:31:10.920 +which grew out of and are supported by a single tech giant, + +00:31:11.520 --> 00:31:14.240 +think.NET and Microsoft or Swift and Apple, + +00:31:15.440 --> 00:31:18.020 +Python is by developers for developers. + +00:31:18.660 --> 00:31:20.780 +That makes it a truly special place. + +00:31:21.080 --> 00:31:24.180 +Everyone here is here because they chose to be here. + +00:31:24.760 --> 00:31:25.460 +It's our garden. + +00:31:25.640 --> 00:31:30.960 +and we have to tend it. That means supporting the ecosystem, and a big part of that is supporting + +00:31:31.080 --> 00:31:36.200 +the PSF. That's why I was thrilled when JetBrains and the PyCharm team in particular reached out to + +00:31:36.200 --> 00:31:40.940 +me to help spread the word about their PSF fundraiser. Here's the deal. You can purchase + +00:31:41.260 --> 00:31:48.540 +PyCharm Pro for 30% off this week until December 19th, 2025. And when you do, not only do you get + +00:31:48.540 --> 00:31:54.900 +a deal on PyCharm, but the PyCharm team will donate all the proceeds to the PSF. Let me say + +00:31:54.860 --> 00:32:00.680 +that again because it's easy to miss. They will donate all the proceeds to the PSF. 
And a positive
+
+00:32:00.780 --> 00:32:05.700
+addition this year is that this also applies to renewals, not just new customers. So if you're
+
+00:32:05.920 --> 00:32:11.580
+already a PyCharm fan and customer, renew this week and you get a discount plus you support the PSF.
+
+00:32:12.180 --> 00:32:16.160
+To take part in this, you'll need to be sure to use Talk Python's exclusive promo code,
+
+00:32:17.179 --> 00:32:22.120
+STRONGERPYTHON, all caps, two words. That's STRONGERPYTHON for the code and the link is
+
+00:32:22.140 --> 00:32:29.380
+talkpython.fm/pycharm-psf-2025. Both of these are in your podcast player's show
+
+00:32:29.480 --> 00:32:36.180
+notes. Thank you to PyCharm for supporting the show. I'm trying to decide if I feel like it's
+
+00:32:36.350 --> 00:32:41.280
+in the interest, if the motivation of any of those platforms would have been to create something like
+
+00:32:41.400 --> 00:32:44.880
+this, right? Like the Jupyter people are like, I don't see the problem. You just write for Jupyter
+
+00:32:44.880 --> 00:32:49.100
+and it works. And the Colab people are like, I don't see a problem. You just write for Colab and
+
+00:32:49.080 --> 00:32:55.780
+it works, right? Like what is the platform people's motivation to create integrations and smooth the
+
+00:32:56.260 --> 00:33:00.200
+transition to the other ones? That's one tension. On the other though, they might have more stuff
+
+00:33:00.420 --> 00:33:04.920
+come to them if they make it more reusable across them, right? So it's interesting that none of them
+
+00:33:05.080 --> 00:33:09.620
+actually did that. Yeah, I think, I mean, in hindsight, it makes sense why someone in my
+
+00:33:09.820 --> 00:33:14.340
+position, like, I guess, focused on this because it was like, I had two widgets and then times the
+
+00:33:14.440 --> 00:33:17.580
+number of platforms. 
And then if something broke, you'd have to go fix it a bunch of different + +00:33:17.600 --> 00:33:20.940 +places and it just becomes a maintenance nightmare. + +00:33:22.120 --> 00:33:26.080 +But one thing that's been really interesting is I think that there is a good pressure to + +00:33:26.860 --> 00:33:28.600 +support a healthy ecosystem. + +00:33:29.020 --> 00:33:34.240 +So I'd say before anywidget, it was pretty complicated to create custom widgets. + +00:33:34.500 --> 00:33:39.400 +And so one of my goals was I want the feeling of being able to author a widget to be very + +00:33:39.700 --> 00:33:44.160 +similar to the way that you can copy and paste code into a notebook, but move it into a file. + +00:33:44.420 --> 00:33:45.940 +And then you could publish that to PyPI. + +00:33:46.380 --> 00:33:48.280 +And that workflow always worked for Python, + +00:33:48.820 --> 00:33:51.560 +but there was no way to start prototyping a widget + +00:33:51.820 --> 00:33:53.940 +inside of a notebook before and then move it out. + +00:33:54.160 --> 00:33:55.840 +And I really wanted to bring sort of, + +00:33:56.020 --> 00:33:57.220 +to lower that barrier to entry. + +00:33:58.360 --> 00:33:59.900 +And then what I think came out of that + +00:33:59.940 --> 00:34:01.860 +is you get a much richer ecosystem of people + +00:34:01.920 --> 00:34:04.360 +that are willing to try and make things. + +00:34:05.100 --> 00:34:07.220 +And then when there's a cool widget that comes out, + +00:34:07.320 --> 00:34:08.600 +then that's a good positive pressure + +00:34:08.760 --> 00:34:09.760 +for other ecosystems, + +00:34:10.100 --> 00:34:11.280 +because then people are trying to request, + +00:34:11.300 --> 00:34:13.700 +they go, hey, I want this widget to work in your environment. + +00:34:13.940 --> 00:34:16.340 +And that puts more pressure on various environments + +00:34:16.360 --> 00:34:19.000 +to implement sort of a more standardized approach. 
+
+00:34:19.860 --> 00:34:22.419
+Or adopt an adapter layer like anywidget.
+
+00:34:22.720 --> 00:34:23.419
+Yep. Yeah, exactly.
+
+00:34:24.120 --> 00:34:25.320
+Yeah, to sort of back you up here,
+
+00:34:26.280 --> 00:34:28.560
+Kostal says, thank you for building anywidget.
+
+00:34:28.960 --> 00:34:30.700
+Having gone through creating an IPy widget,
+
+00:34:30.879 --> 00:34:31.620
+it was a lot of work.
+
+00:34:32.000 --> 00:34:33.440
+So yeah, exactly.
+
+00:34:35.460 --> 00:34:38.919
+Let's just take a tour through the widget gallery.
+
+00:34:39.399 --> 00:34:41.000
+Like what widgets are available here?
+
+00:34:42.020 --> 00:34:43.120
+Got some favorites?
+
+00:34:43.540 --> 00:34:44.919
+Yeah, we definitely have some favorites.
+
+00:34:45.659 --> 00:34:48.820
+I would say that one of the early adopters of AnyWidget
+
+00:34:48.820 --> 00:34:51.139
+is a fairly popular plotting library called Altair.
+
+00:34:53.379 --> 00:34:56.220
+And it allows you to do exactly what you were talking about earlier
+
+00:34:56.419 --> 00:34:59.840
+with selecting points and allowing you to get those back
+
+00:34:59.890 --> 00:35:01.160
+as data frames in the kernel.
+
+00:35:01.420 --> 00:35:03.520
+So for a while, the way that Altair worked,
+
+00:35:04.150 --> 00:35:09.860
+and I think it still by default works inside of a Jupyter environment,
+
+00:35:09.870 --> 00:35:12.160
+is that it creates sort of an interactive output
+
+00:35:12.250 --> 00:35:13.800
+that isn't connected back to the kernel.
+
+00:35:14.160 --> 00:35:14.880
+So you can get your output.
+
+00:35:15.220 --> 00:35:21.080
+It feels interactive because you can zoom in and you can select and you can do all the things, but it's just a view.
+
+00:35:21.720 --> 00:35:22.080
+Exactly.
+
+00:35:22.340 --> 00:35:26.760
+That state, I like to think of it as trapped in the output, and you can't get it back in the notebook. 
+
+00:35:27.080 --> 00:35:33.780
+And so what the AnyWidget does behind the scenes is it allows for that output to communicate back with the kernel,
+
+00:35:34.240 --> 00:35:40.180
+which then allows you to update an object or a selection and then run your cell again and view some output.
+
+00:35:40.580 --> 00:35:43.160
+And just that bi-directional communication
+
+00:35:43.540 --> 00:35:45.200
+between your kernel and your front end
+
+00:35:45.380 --> 00:35:47.220
+allows you to do things like create a data frame
+
+00:35:47.480 --> 00:35:48.960
+that is updated when you select.
+
+00:35:50.360 --> 00:35:50.520
+Yeah.
+
+00:35:51.120 --> 00:35:51.260
+Nice.
+
+00:35:51.720 --> 00:35:52.680
+Yeah, it's sort of that example
+
+00:35:53.280 --> 00:35:55.620
+we talked about the outliers before, right?
+
+00:35:56.020 --> 00:35:56.540
+Yeah, exactly.
+
+00:35:58.240 --> 00:35:59.300
+Altair's super cool.
+
+00:35:59.880 --> 00:36:01.560
+You know, I talked to Jake VanderPlas about it
+
+00:36:01.580 --> 00:36:04.080
+when it first came out and it's very beautiful.
+
+00:36:04.540 --> 00:36:06.260
+Yeah, Altair was actually, I think,
+
+00:36:06.600 --> 00:36:09.280
+how I got into open source contributions forever ago.
+
+00:36:09.680 --> 00:36:11.380
+Like I made like documentation examples
+
+00:36:11.620 --> 00:36:12.280
+and it was very like,
+
+00:36:13.530 --> 00:36:14.860
+it was very fun to come full circle
+
+00:36:14.970 --> 00:36:16.660
+to actually have like a dependency
+
+00:36:17.020 --> 00:36:18.540
+that somehow like got back into that library
+
+00:36:18.990 --> 00:36:19.560
+many years later.
+
+00:36:20.060 --> 00:36:21.880
+Now you build something that makes it more,
+
+00:36:22.200 --> 00:36:24.200
+that supercharges Altair
+
+00:36:24.400 --> 00:36:26.600
+because now it has like bidirectional data integration
+
+00:36:27.360 --> 00:36:27.680
+sort of thing.
+
+00:36:27.970 --> 00:36:28.140
+Yeah, yeah. 
+
+00:36:28.300 --> 00:36:28.780
+Yeah, that's wild.
+
+00:36:29.060 --> 00:36:30.820
+Yeah, maybe just talk to people really quick about that.
+
+00:36:30.970 --> 00:36:33.780
+Like you don't have to write a revolutionary feature
+
+00:36:33.810 --> 00:36:34.740
+to be part of open source.
+
+00:36:35.140 --> 00:36:36.280
+Yeah, yeah, definitely.
+
+00:36:36.610 --> 00:36:39.640
+I think my entry to open source was just
+
+00:36:39.660 --> 00:36:45.260
+I got interested in a plotting library and they had some open tickets for making examples for their documentation.
+
+00:36:45.750 --> 00:36:47.880
+And I wanted to learn how to use the plotting library.
+
+00:36:48.090 --> 00:36:49.840
+So then I started contributing to them.
+
+00:36:50.170 --> 00:36:58.680
+And it was really like those interactions, I think, with maintainers that kind of got a tick in my head where I was like, oh, I think I like this way of working and communicating.
+
+00:36:58.880 --> 00:37:07.400
+Like there's a lot of, I don't know, oftentimes the challenges in open source aren't so technical as they are just social and figuring out how to communicate expectations to different users.
+
+00:37:07.660 --> 00:37:10.720
+And I think specifically interacting with Jake VanderPlas
+
+00:37:10.720 --> 00:37:11.880
+and some of those issues,
+
+00:37:13.360 --> 00:37:15.080
+I think I learned a lot about that
+
+00:37:15.300 --> 00:37:17.920
+and I was attracted to trying to find new ways
+
+00:37:18.120 --> 00:37:20.560
+to work on problems in open source.
+
+00:37:20.860 --> 00:37:21.560
+Yeah, awesome.
+
+00:37:21.880 --> 00:37:23.200
+All right, well, that's number one
+
+00:37:23.500 --> 00:37:25.780
+with 10,000 favorites out of the community.
+
+00:37:27.000 --> 00:37:27.560
+What are some others? 
+
+00:37:28.180 --> 00:37:29.500
+Some of the other, I would say like,
+
+00:37:29.780 --> 00:37:32.360
+so Vincent Warmerdam, who is one of my colleagues
+
+00:37:32.510 --> 00:37:34.800
+and you've had him on the podcast many times before.
+
+00:37:34.870 --> 00:37:35.520
+A couple times, yeah.
+
+00:37:35.680 --> 00:37:38.100
+Yeah, he's created this drawdata widget.
+
+00:37:38.240 --> 00:37:39.920
+And I would say when this widget came out,
+
+00:37:40.020 --> 00:37:40.840
+this was very much,
+
+00:37:41.320 --> 00:37:41.820
+there are many different,
+
+00:37:43.660 --> 00:37:44.660
+it demos very well.
+
+00:37:44.940 --> 00:37:47.400
+So the idea is that you have like a canvas
+
+00:37:47.600 --> 00:37:48.540
+that you can draw some points
+
+00:37:48.740 --> 00:37:50.660
+and you get those points back out as a data frame.
+
+00:37:51.800 --> 00:37:55.240
+And yes, so Vincent has a nice Marimo notebook
+
+00:37:55.480 --> 00:37:56.300
+that's running in the browser.
+
+00:37:57.240 --> 00:37:57.740
+Yeah, that's right.
+
+00:37:57.920 --> 00:37:58.500
+There we go.
+
+00:37:59.000 --> 00:38:01.120
+And I believe if you draw some points.
+
+00:38:01.580 --> 00:38:02.120
+I don't like the brush.
+
+00:38:02.280 --> 00:38:02.900
+I'm going to change the brush.
+
+00:38:03.340 --> 00:38:04.360
+We've got precise data.
+
+00:38:04.640 --> 00:38:09.960
+So you could have one kind of data set here and then you could go like, okay, we're going to do,
+
+00:38:10.700 --> 00:38:14.660
+you know, this is super interesting because maybe you say like the data kind of looks like this and
+
+00:38:14.660 --> 00:38:18.500
+I want to run an algorithm against it, but there's a ton of work to get the data in the right format.
+
+00:38:18.690 --> 00:38:24.460
+You could just start, you know, kind of just visually doing these things. And then you can
+
+00:38:24.490 --> 00:38:29.600
+go on and analyze it and do all sorts of stuff just literally by drawing. 
+

+00:38:30.019 --> 00:38:34.440

+Yeah. We found that educators are quite excited about like this kind of widget.

+00:38:34.680 --> 00:38:35.860

+Because it's like, hey, here's this,

+00:38:36.640 --> 00:38:38.060

+how does a random forest algorithm work?

+00:38:38.190 --> 00:38:39.760

+It's like, okay, we can just draw a data set

+00:38:39.790 --> 00:38:40.780

+and then we can actually view

+00:38:41.610 --> 00:38:42.760

+how a classification would work

+00:38:42.850 --> 00:38:44.400

+over a specific type of data set.

+00:38:44.540 --> 00:38:47.300

+And I think it's really that type of interplay

+00:38:47.450 --> 00:38:49.080

+between something that you can play with

+00:38:49.360 --> 00:38:50.720

+to create something with the data,

+00:38:50.870 --> 00:38:53.440

+and then maybe you take scikit-learn or something

+00:38:53.490 --> 00:38:54.300

+and apply an algorithm,

+00:38:54.700 --> 00:38:56.260

+and you can help build intuition

+00:38:56.440 --> 00:38:57.460

+for how these algorithms work

+00:38:57.620 --> 00:39:00.900

+just by plugging it in with the actual workhorse

+00:39:00.910 --> 00:39:02.380

+of doing the operations

+00:39:02.380 --> 00:39:04.560

+and like computing, like, your classifier.

+00:39:04.780 --> 00:39:06.580

+But instead you have this new input,

+00:39:06.790 --> 00:39:09.180

+which is like allowing for a bit more play

+00:39:09.420 --> 00:39:10.860

+with like learning how those algorithms work.

+00:39:11.360 --> 00:39:14.920

+Yeah, you have an algorithm and you say,

+00:39:15.080 --> 00:39:17.260

+I want to see how, if the data looked like this,

+00:39:17.620 --> 00:39:18.020

+what would happen?

+00:39:18.120 --> 00:39:19.720

+If the data looked like that, what would happen?
+

+00:39:20.080 --> 00:39:21.680

+And a lot of times people will resort

+00:39:21.840 --> 00:39:24.140

+to generating that data with code,

+00:39:24.550 --> 00:39:26.140

+either by writing it themselves,

+00:39:26.540 --> 00:39:28.960

+or using a library like Faker or something like that,

+00:39:29.030 --> 00:39:30.700

+you know, like, how am I going to get it?

+00:39:30.880 --> 00:39:31.760

+What if it looks like this?

+00:39:31.940 --> 00:39:34.400

+And here you just say, it looks like this, and you draw it.

+00:39:34.500 --> 00:39:38.740

+And then you run your algorithm based on the output of it because it's a widget, an AnyWidget widget.

+00:39:39.180 --> 00:39:40.380

+And then you just keep going.

+00:39:40.560 --> 00:39:41.720

+It's super neat, actually.

+00:39:42.140 --> 00:39:42.440

+Yeah.

+00:39:42.980 --> 00:39:43.060

+Yeah.

+00:39:43.180 --> 00:39:47.680

+So to go back to, I think, something you mentioned before as well,

+00:39:48.460 --> 00:39:50.380

+AnyWidget sort of serves two communities.

+00:39:50.640 --> 00:39:54.800

+So I think for one, there are folks that will never need to touch or learn any JavaScript,

+00:39:55.060 --> 00:39:57.780

+but can just use these libraries like they would Python packages.

+00:39:58.160 --> 00:40:00.700

+So the idea is, hey, I can pip install draw data,

+00:40:00.940 --> 00:40:02.100

+and now I get to use a scatter widget.

+00:40:02.420 --> 00:40:03.600

+And I don't care at all how it works,

+00:40:03.860 --> 00:40:06.980

+but now I get to understand how this algorithm works.

+00:40:07.440 --> 00:40:08.520

+But for those that are interested,

+00:40:09.000 --> 00:40:12.240

+they can go and learn maybe a little bit of JavaScript

+00:40:12.370 --> 00:40:14.100

+or progressively a little bit more JavaScript

+00:40:15.080 --> 00:40:16.620

+to create something and package it up

+00:40:16.780 --> 00:40:18.620

+for many different platforms.
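To make the draw-then-analyze loop concrete, here's a sketch. The points below are hard-coded stand-ins for what the drawdata widget hands back to Python (it exposes the drawn points through an accessor such as `data_as_pandas` — treat that name as an assumption and check the drawdata docs). The classifier is a tiny stdlib-only nearest-centroid rule, standing in for whatever scikit-learn model you'd actually run:

```python
# Stand-in for drawn points: (x, y, label) tuples, as if two clusters
# had been sketched on the drawdata canvas.
from math import dist
from statistics import fmean

points = [
    (1.0, 1.0, "a"), (1.5, 1.2, "a"), (0.8, 0.9, "a"),
    (5.0, 5.0, "b"), (5.5, 4.8, "b"), (4.9, 5.3, "b"),
]

def centroids(pts):
    """Average (x, y) position per label."""
    out = {}
    for label in {p[2] for p in pts}:
        xs = [p[0] for p in pts if p[2] == label]
        ys = [p[1] for p in pts if p[2] == label]
        out[label] = (fmean(xs), fmean(ys))
    return out

def classify(point, cents):
    """Label of the nearest centroid."""
    return min(cents, key=lambda lbl: dist(point, cents[lbl]))

cents = centroids(points)
print(classify((1.2, 1.1), cents))  # lands near the first drawn cluster
print(classify((5.2, 5.1), cents))  # lands near the second drawn cluster
```

Redraw the points, rerun the cell, and the "what if the data looked like this instead?" loop is exactly what the widget makes cheap.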
+

+00:40:18.940 --> 00:40:20.840

+So you sort of have people that want to make libraries

+00:40:21.010 --> 00:40:22.080

+and then you have library consumers.

+00:40:22.480 --> 00:40:25.300

+And depending on how much front-end stuff you want to learn,

+00:40:25.420 --> 00:40:27.520

+that's how deep you can go into either of those sides.

+00:40:27.740 --> 00:40:29.740

+But I'd say probably many more people

+00:40:29.760 --> 00:40:34.540

+are just widget users and happily wire them together and don't worry about it and move on

+00:40:34.570 --> 00:40:38.480

+with their day. And then there's some folks that maybe go a little bit deeper on the JS side and

+00:40:39.220 --> 00:40:43.360

+learn how to create these interactive experiences. When you say wire them together, do you mean like

+00:40:43.520 --> 00:40:48.880

+put one in a notebook in one cell and then another in another cell, but then say the input of it is

+00:40:48.890 --> 00:40:53.320

+the output of the other, that kind of wire together? Exactly. Yeah. So you can really think of these as

+00:40:53.330 --> 00:40:57.760

+like building blocks that you can like compose inside of a notebook environment. And I think

+00:40:57.760 --> 00:41:02.440

+specifically in Marimo, a really cool thing is that when you update that value that lives in the

+00:41:02.560 --> 00:41:07.680

+kernel, that cell below will rerun. So now it's not just like you do the thing and then you have to

+00:41:07.800 --> 00:41:12.920

+manually rerun the cell. You actually start to build up, based off of our reactivity graph, like a little bit

+00:41:12.920 --> 00:41:18.600

+of an application, of like I select some cells and now that reruns this Python code. And if I have

+00:41:18.680 --> 00:41:22.780

+more cells that run after that, it all is sort of wired up from this new input, which is your widget.

+00:41:23.320 --> 00:41:27.340

+Yeah, I kind of riffed on this a little bit when I said like the world's craziest GOTO.
+

+00:41:27.440 --> 00:41:30.540

+And I said the second craziest because the most crazy is Excel.

+00:41:31.140 --> 00:41:37.820

+But that's because the way Marimo solves that is it actually understands the way these variables are

+00:41:38.060 --> 00:41:38.840

+used across cells.

+00:41:38.900 --> 00:41:40.960

+So when I move a widget in one, it redraws.

+00:41:41.040 --> 00:41:44.480

+It's like, oh, the thing that depended upon it also now has to change.

+00:41:44.580 --> 00:41:48.200

+And then like you can, it cascades that execution across the widgets, right?

+00:41:48.640 --> 00:41:48.960

+Exactly.

+00:41:49.280 --> 00:41:54.080

+And that would normally just happen if you use our built-in UI elements or if you rerun cells.

+00:41:54.900 --> 00:42:01.700

+And because it's just this reactive model where we trace the dependency, we just track what the dependencies are statically between your cells.

+00:42:02.500 --> 00:42:07.680

+We know that when you update this property on a widget, we know the cells that depend on that widget have to update.

+00:42:08.000 --> 00:42:13.280

+And so, yeah, it just sort of all falls out of this simple idea of your notebooks are a graph.

+00:42:13.400 --> 00:42:15.880

+They're not just like this manual.

+00:42:16.700 --> 00:42:20.860

+I think of it as like a lot of like running cells is like manual callbacks.

+00:42:20.960 --> 00:42:24.620

+Like you, you like do something and you're like, oh, I have to click a callback and run it.

+00:42:24.860 --> 00:42:27.880

+And by modeling it as a graph, we can run those for you.

+00:42:28.220 --> 00:42:35.540

+And so that we know exactly what needs to update when this dependency of that part of the graph is invalidated.

+00:42:35.680 --> 00:42:42.740

+I've talked to some data scientists who are like, that is both a feature and a problem that you can just be so freeform in Jupyter Notebooks, right?
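The "notebooks are a graph" idea can be sketched in a few lines. This is a toy model, not Marimo's actual implementation: each cell declares which variables it defines and which it reads, and updating one variable walks the graph to find only the cells that must rerun, in order:

```python
# Toy dependency graph for four notebook cells.
cells = {
    "c1": {"defines": {"x"}, "reads": set()},
    "c2": {"defines": {"y"}, "reads": {"x"}},
    "c3": {"defines": {"z"}, "reads": {"y"}},
    "c4": {"defines": {"w"}, "reads": set()},  # unrelated cell
}

def dependents(var):
    """Cells to rerun, in order, after `var` changes."""
    to_run, seen = [], set()
    frontier = {var}
    while frontier:
        nxt = set()
        for name, cell in cells.items():
            if name not in seen and cell["reads"] & frontier:
                seen.add(name)
                to_run.append(name)
                # Anything this cell defines invalidates cells downstream.
                nxt |= cell["defines"]
        frontier = nxt
    return to_run

print(dependents("x"))  # c2 reruns, then c3; c4 is untouched
```

That last comment is the whole point: moving a slider that defines `x` reruns `c2` and `c3` automatically, and the unrelated `c4` never recomputes.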
+ +00:42:42.900 --> 00:42:44.780 +Like I can just iterate and play. + +00:42:44.860 --> 00:42:48.060 +And I think while you're iterating and playing, that freedom is great. + +00:42:48.160 --> 00:42:53.080 +But as soon as you want to start making decisions, then it becomes like a real danger point, right? + +00:42:53.460 --> 00:42:56.100 +Yeah, I think there's kind of a nice balance. + +00:42:56.360 --> 00:43:04.940 +So specifically in Marimo, there are some requirements for how you have to write your Python code that are slightly more limiting than within a Jupyter environment. + +00:43:05.200 --> 00:43:10.280 +And one that often trips folks up is that you can't redeclare variables across cells or else it would be ambiguous. + +00:43:11.060 --> 00:43:14.800 +Like, is this the cell that defines X or is this the cell that defines X? + +00:43:15.640 --> 00:43:16.880 +But I like to think of it as- + +00:43:16.880 --> 00:43:18.060 +Which order did you run it in? + +00:43:18.060 --> 00:43:19.080 +That's the one. + +00:43:19.560 --> 00:43:20.280 +Yeah, exactly. + +00:43:20.280 --> 00:43:21.740 +That's tricky. + +00:43:21.740 --> 00:43:23.500 +So we're fairly strict on that. + +00:43:23.500 --> 00:43:27.760 +But what we believe is if you buy into this constraint, then we can give you all these + +00:43:27.760 --> 00:43:33.740 +properties, which are deterministic execution order and the ability to turn these scripts + +00:43:34.860 --> 00:43:39.620 +into reproducible, or turn these sort of open-ended workflows into more reproducible artifacts + +00:43:39.620 --> 00:43:42.140 +that have a deterministic execution order. + +00:43:43.280 --> 00:43:44.880 +And so I like to think of it a little bit as like, + +00:43:45.110 --> 00:43:47.060 +I used to type Python code without type hints. + +00:43:47.620 --> 00:43:48.740 +Then I started using type hints, + +00:43:48.810 --> 00:43:51.500 +and now I can't imagine not having any autocomplete. 
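The redeclaration constraint being described can be illustrated with a static pass over cell sources. This is an illustrative sketch, not Marimo's real (much more thorough) analysis; it uses the stdlib `ast` module to flag any variable assigned in more than one cell, which is exactly the case where execution order becomes ambiguous:

```python
import ast
from collections import defaultdict

# Three hypothetical cells; cell_3 redeclares x, so "which cell defines x?"
# has no unambiguous answer.
cell_sources = {
    "cell_1": "x = 1",
    "cell_2": "y = x + 1",
    "cell_3": "x = 2",
}

defined_in = defaultdict(list)
for name, src in cell_sources.items():
    for node in ast.walk(ast.parse(src)):
        # ast.Store context means this Name is being assigned, not read.
        if isinstance(node, ast.Name) and isinstance(node.ctx, ast.Store):
            defined_in[node.id].append(name)

conflicts = {var: cells for var, cells in defined_in.items() if len(cells) > 1}
print(conflicts)  # {'x': ['cell_1', 'cell_3']}
```

Refuse to run when `conflicts` is non-empty and you get the deterministic execution order described above, because every variable now has exactly one defining cell.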
+

+00:43:51.770 --> 00:43:53.620

+And now I think like maybe our type,

+00:43:53.940 --> 00:43:55.800

+like when you start working with data in long-lived sessions,

+00:43:56.040 --> 00:43:58.600

+having some of these guardrails actually helps you keep on track,

+00:43:59.180 --> 00:44:00.900

+such that if you accidentally delete a variable,

+00:44:01.680 --> 00:44:02.980

+we'll let you know that you deleted it,

+00:44:03.050 --> 00:44:05.060

+and it's not something that the next time you boot up the notebook,

+00:44:05.240 --> 00:44:06.280

+you're just missing that variable.

+00:44:06.780 --> 00:44:08.840

+So it's something that you buy into,

+00:44:09.080 --> 00:44:10.820

+but then I think has all these nice consequences

+00:44:10.970 --> 00:44:12.260

+that come out of that--

+00:44:12.420 --> 00:44:14.040

+- As a newbie, you'll see like,

+00:44:14.480 --> 00:44:15.360

+these things are not defined,

+00:44:15.370 --> 00:44:16.460

+or this library doesn't exist.

+00:44:16.760 --> 00:44:17.680

+And you know what, it does exist,

+00:44:17.880 --> 00:44:20.060

+you just skipped running the top import cell,

+00:44:20.330 --> 00:44:22.300

+or you know, that stuff's a little frustrating.

+00:44:23.080 --> 00:44:25.100

+So during this talk you talked about,

+00:44:25.380 --> 00:44:26.660

+during your talk previously,

+00:44:26.930 --> 00:44:29.960

+you talked about like how anywidget allows you to write

+00:44:30.380 --> 00:44:31.240

+just enough JavaScript.

+00:44:32.590 --> 00:44:33.300

+What do you mean by that?

+00:44:33.380 --> 00:44:34.880

+- What do I mean by just enough JavaScript?

+00:44:35.180 --> 00:44:38.180

+- Yeah, like, so one person's just enough JavaScript,

+00:44:38.460 --> 00:44:40.520

+another person's like, whoa, way too much JavaScript.

+00:44:41.040 --> 00:44:41.320

+Sure.
+

+00:44:41.680 --> 00:44:44.940

+So when I first started learning any front-end code,

+00:44:45.440 --> 00:44:47.680

+my experience was I opened up an HTML file

+00:44:47.800 --> 00:44:49.700

+and I just wrote some JavaScript on the page.

+00:44:49.920 --> 00:44:51.760

+And I completely love this workflow

+00:44:52.040 --> 00:44:54.900

+for playing and learning how to write code

+00:44:54.900 --> 00:44:56.640

+for the first time in the browser.

+00:44:57.360 --> 00:44:58.280

+But I think over the years,

+00:44:59.440 --> 00:45:01.260

+as people have started to build things

+00:45:01.580 --> 00:45:05.020

+like Figma and Adobe in the browser,

+00:45:05.240 --> 00:45:07.200

+there's a ton of tooling that has come up

+00:45:07.180 --> 00:45:09.240

+to help build those types of applications.

+00:45:09.920 --> 00:45:13.040

+So many people's first experience with writing JavaScript

+00:45:13.220 --> 00:45:15.440

+is someone telling them that they need to learn React

+00:45:15.720 --> 00:45:17.180

+or learn a specific framework.

+00:45:18.460 --> 00:45:20.980

+And then it's a bit like, well, this isn't technically JavaScript.

+00:45:21.200 --> 00:45:23.200

+This is a flavor of JavaScript that we transform.

+00:45:24.160 --> 00:45:27.220

+And so the experience that I wanted for anywidget

+00:45:27.500 --> 00:45:30.860

+was to write this standardized JavaScript code

+00:45:31.000 --> 00:45:33.420

+as sort of the entry point to your widgets.

+00:45:34.020 --> 00:45:37.140

+So it should feel as simple as you could just open up

+00:45:37.140 --> 00:45:39.260

+a browser console and start typing in code.

+00:45:39.660 --> 00:45:42.820

+And the code that you write is exactly what the browser understands.

+00:45:43.200 --> 00:45:45.620

+And so it should be dead simple from the beginning.

+00:45:46.780 --> 00:45:51.380

+Just learning some APIs and probably pattern matching between, okay, this is some syntax.
+

+00:45:51.620 --> 00:45:53.000

+But it should feel pretty familiar.

+00:45:53.540 --> 00:45:57.440

+Should you want to learn those frameworks or use those frameworks, those all have the

+00:45:57.620 --> 00:46:01.520

+ability to be transformed into this standardized JavaScript.

+00:46:01.620 --> 00:46:03.800

+And in fact, they have to be in order to run in the browser.

+00:46:03.980 --> 00:46:04.740

+That's how that all works.

+00:46:05.540 --> 00:46:15.020

+So I wanted that initial experience to be, if you want to learn and you've never tried it out before, send a variable and you can console log that variable and it just works.

+00:46:15.540 --> 00:46:23.000

+But should you want to build something very ambitious, and a lot of our most popular complicated widgets do use some of these frameworks, that's something that you can learn.

+00:46:23.400 --> 00:46:30.800

+So I could do a Vue or an Alpine.js or I could do a React and all of that kind of stuff if I want.

+00:46:30.840 --> 00:46:40.280

+Yeah, so in our documentation, we have like a little CLI to bootstrap a project that's like ready to be used with React.

+00:46:41.940 --> 00:46:46.680

+And I think we had a contributor contribute one for Vue and Svelte as well.

+00:46:47.780 --> 00:46:56.420

+And the whole idea there is like, yes, the tradeoff is that now you are introducing like some JavaScript build tooling into this process.

+00:46:56.780 --> 00:47:01.920

+But hopefully if you're familiar with those frameworks, that isn't like a big overhead to trying out some of those things.

+00:47:02.980 --> 00:47:10.260

+Versus like for the Python beginner that wants to learn something in the front end and they're just trying to get their data into the front end.
+

+00:47:10.380 --> 00:47:15.400

+I just don't want them to have to worry about TypeScript or React or any of these things that they might hear about.

+00:47:15.520 --> 00:47:22.860

+And instead they can just get started with trying to paint some pixels on the screen and then progressively learn outside of that ecosystem.

+00:47:22.880 --> 00:47:27.060

+So the entry point will always be sort of this simple front end standard.

+00:47:27.470 --> 00:47:32.360

+And then however far you want to go into that ecosystem, then you can experience more of

+00:47:32.390 --> 00:47:37.020

+like the tooling there to help with reactivity and things in the front end.

+00:47:37.320 --> 00:47:41.800

+Yeah, that's super interesting to think of injecting little miniature Vue applications

+00:47:42.420 --> 00:47:44.220

+and stuff to allow that to work.

+00:47:44.250 --> 00:47:48.180

+But they are super powerful if you're willing to go through the build steps and all the

+00:47:48.900 --> 00:47:52.420

+hoops that those different frameworks ask you to do to get it to run.

+00:47:52.460 --> 00:47:57.240

+And once you get it set up, it's like, okay, these things all bi-directionally data bind to each other internally.

+00:47:57.460 --> 00:48:04.320

+And so I can see how that plays just like perfectly naturally with the already-existing binding of the dynamic interaction here.

+00:48:04.720 --> 00:48:05.340

+Yeah, exactly.

+00:48:05.760 --> 00:48:13.900

+So we model this as like we have the standard, which is like this ECMAScript standard, which is the JavaScript code that you write.

+00:48:14.620 --> 00:48:18.140

+And then all of the libraries are modeled as like adapters on top of that standard.

+00:48:18.340 --> 00:48:19.260

+I think we call them bridges.
+

+00:48:19.820 --> 00:48:22.500

+So in React, you get hooks that you can call,

+00:48:22.650 --> 00:48:25.680

+and then you're writing code that looks like idiomatic React,

+00:48:25.940 --> 00:48:28.400

+but behind the scenes, it's calling anywidget APIs.

+00:48:29.370 --> 00:48:32.300

+And our Vue bridge does the same thing, but with Vue APIs.

+00:48:32.530 --> 00:48:36.240

+So you get to write front-end code that feels like it's Vue-like or React-like,

+00:48:36.360 --> 00:48:38.860

+but behind the scenes, you have these custom bridges

+00:48:38.970 --> 00:48:40.860

+that are written by folks that are familiar with those frameworks

+00:48:41.330 --> 00:48:43.680

+for how they should interface with our standard specification.

+00:48:44.120 --> 00:48:44.680

+Yeah, wild.

+00:48:45.100 --> 00:48:46.620

+Let's talk about not using one of those,

+00:48:46.800 --> 00:48:50.620

+but instead talking through building a simple widget.

+00:48:50.840 --> 00:48:52.300

+Now, just to be clear,

+00:48:52.440 --> 00:48:54.960

+I know we can't read all of the code in the audio version

+00:48:55.180 --> 00:48:56.620

+because most people just listen to the show,

+00:48:57.080 --> 00:48:59.520

+but maybe just give us a sense going through this.

+00:48:59.790 --> 00:49:01.540

+You have a build-a-counter widget example.

+00:49:02.060 --> 00:49:02.160

+Yeah.

+00:49:02.380 --> 00:49:03.860

+Or I can click a button and it counts or something.

+00:49:04.640 --> 00:49:05.320

+Yeah, just walk us through,

+00:49:05.910 --> 00:49:07.380

+just to give people a sense of what does it mean

+00:49:07.430 --> 00:49:08.820

+to build one of these AnyWidget widgets.

+00:49:09.220 --> 00:49:11.140

+Yeah, so the idea within AnyWidget

+00:49:11.170 --> 00:49:14.580

+is that you create a class that extends from a single class

+00:49:15.340 --> 00:49:17.240

+that's in anywidget called AnyWidget.
+

+00:49:17.700 --> 00:49:20.520

+And that is where you define both your view,

+00:49:20.700 --> 00:49:22.320

+so the front end part of that code,

+00:49:22.900 --> 00:49:24.960

+and then also the back end code for that.

+00:49:25.220 --> 00:49:27.540

+So it fully encapsulates the idea of a widget

+00:49:27.680 --> 00:49:29.800

+that has both a back end and a front end component.

+00:49:30.300 --> 00:49:32.220

+Yeah, and just to be clear for people,

+00:49:32.300 --> 00:49:33.460

+let's say when you say a class,

+00:49:33.700 --> 00:49:36.760

+it's not one of those weird prototype JavaScript type things.

+00:49:36.860 --> 00:49:39.680

+It's the more modern class keyword,

+00:49:40.300 --> 00:49:41.640

+but in ECMAScript, is that right?

+00:49:41.920 --> 00:49:42.380

+Oh, sorry.

+00:49:42.680 --> 00:49:46.080

+So right, what we're looking at here is just a Python class that's extending.

+00:49:46.080 --> 00:49:47.340

+Oh, this is a Python class.

+00:49:47.470 --> 00:49:47.600

+Okay.

+00:49:47.920 --> 00:49:48.000

+Yeah.

+00:49:48.080 --> 00:49:48.720

+A Python class.

+00:49:48.970 --> 00:49:49.080

+Great.

+00:49:49.210 --> 00:49:51.980

+And so you create a Python class, but then it has the JavaScript.

+00:49:52.540 --> 00:49:52.760

+Exactly.

+00:49:53.190 --> 00:49:57.580

+So _esm is like a private field that defines a front end module.

+00:49:58.120 --> 00:50:02.280

+And the whole idea there is that you define a function that's called render that takes

+00:50:02.380 --> 00:50:05.280

+in two arguments, a model and an element.

+00:50:05.690 --> 00:50:08.760

+So the model is like this object that talks back to the kernel.

+00:50:09.200 --> 00:50:14.980

+And an element is whatever the front end, like Jupyter or VS Code, gives you as like the output on the screen.

+00:50:15.370 --> 00:50:21.340

+And so those two things combined, you can call methods on that model to set a value or get a value.
+

+00:50:21.770 --> 00:50:23.660

+And then you can update some part of the UI.

+00:50:24.640 --> 00:50:25.040

+I see.

+00:50:25.050 --> 00:50:29.140

+So you're passed like a DOM element, which might be a div or whatever.

+00:50:29.330 --> 00:50:34.320

+And then you can just inject JavaScript, like appendChild and other stuff, and set a class and whatever.

+00:50:34.480 --> 00:50:38.500

+And that builds out the little, the part of the DOM that you control or it's given to you?

+00:50:38.800 --> 00:50:41.640

+Exactly. And then you can style that however way that you want.

+00:50:42.620 --> 00:50:45.780

+And then the key thing is that as a part of this adapter,

+00:50:46.460 --> 00:50:49.300

+when you call methods on this model object that you get,

+00:50:49.740 --> 00:50:51.600

+I think there's only a few methods that are on it,

+00:50:51.600 --> 00:50:53.260

+like get, set, and save changes.

+00:50:53.920 --> 00:50:56.340

+And that just synchronizes values back and forth

+00:50:56.620 --> 00:50:57.900

+between the front end and the back end.

+00:50:58.520 --> 00:51:01.880

+And that same sort of API is on the Python side as well.

+00:51:02.040 --> 00:51:04.080

+And so you can react to those values as they update

+00:51:04.340 --> 00:51:06.560

+or set them as well on the Python side.

+00:51:07.540 --> 00:51:09.300

+And then we'll deal for the most part,

+00:51:09.460 --> 00:51:12.420

+like with serialization of simple data types for you.

+00:51:12.420 --> 00:51:13.960

+So if you have a float or an int in Python,

+00:51:14.100 --> 00:51:15.760

+we'll just map that to a number data type.

+00:51:16.420 --> 00:51:17.700

+We have that mapping in our documentation.

+00:51:18.340 --> 00:51:20.500

+If you want to serialize to something like JSON,

+00:51:20.800 --> 00:51:21.520

+that's totally fine too.

+00:51:22.920 --> 00:51:24.360

+And then you just send that over the wire.

+00:51:24.720 --> 00:51:25.700

+Yeah, that's wild.
+

+00:51:25.840 --> 00:51:27.020

+So how does it actually,

+00:51:27.420 --> 00:51:28.660

+so we've got a Python class,

+00:51:28.840 --> 00:51:30.480

+which I guess probably lives in the kernel.

+00:51:31.400 --> 00:51:34.440

+And then it's got HTML JavaScript stuff it's doing

+00:51:34.920 --> 00:51:35.800

+that it's setting values.

+00:51:37.140 --> 00:51:38.880

+What's this exchange?

+00:51:39.350 --> 00:51:41.140

+Where do different pieces run?

+00:51:41.350 --> 00:51:42.620

+How do they communicate?

+00:51:43.020 --> 00:51:43.280

+Sure.

+00:51:43.540 --> 00:51:45.300

+So that is up to the implementation.

+00:51:45.300 --> 00:51:48.100

+So whoever builds an adapter for anywidget.

+00:51:48.520 --> 00:51:52.820

+But the idea is that the front-end code that you write is in the standardized form.

+00:51:52.950 --> 00:51:57.200

+So you can just call import and import this JavaScript module that you have.

+00:51:57.900 --> 00:52:03.059

+And then the front-end is responsible for, when these different methods get called on the model,

+00:52:03.060 --> 00:52:08.540

+sending that data back and communicating over some sort of mechanism to the backend.

+00:52:09.160 --> 00:52:11.300

+Jupyter already has defined something like that.

+00:52:11.300 --> 00:52:13.700

+Yeah, so Jupyter has a notion of something called a comm,

+00:52:14.140 --> 00:52:17.900

+and that is implemented over a web socket to communicate back and forth.

+00:52:18.400 --> 00:52:21.040

+And similar in Marimo, we're not based on Jupyter at all,

+00:52:21.380 --> 00:52:25.560

+but our anywidget adapter just talks over a web socket with some different messages.

+00:52:26.060 --> 00:52:30.900

+But basically, we call this a host platform that takes care of that implementation

+00:52:31.320 --> 00:52:32.580

+and wires up those two ends.
+

+00:52:32.940 --> 00:52:34.960

+But then as the widget author, you just care about like,

+00:52:35.280 --> 00:52:37.400

+here's my JavaScript module, here's my Python code.

+00:52:37.690 --> 00:52:38.600

+And they use these APIs.

+00:52:39.030 --> 00:52:41.780

+And then it's someone else's problem to figure out how to wire those up together.

+00:52:42.140 --> 00:52:42.940

+Yeah, super neat.

+00:52:43.220 --> 00:52:45.000

+Kasaasa, interesting thought here.

+00:52:45.060 --> 00:52:47.500

+I'd be super interested to see this working purely in Python.

+00:52:48.080 --> 00:52:49.200

+Front end running in Wasm.

+00:52:49.560 --> 00:52:51.420

+No need for JavaScript streams in Python.

+00:52:51.830 --> 00:52:52.400

+What do you think here?

+00:52:52.960 --> 00:52:53.360

+Yeah, yeah.

+00:52:53.590 --> 00:53:00.300

+I haven't played around with some of the Pyodide or the PyScript APIs.

+00:53:00.840 --> 00:53:02.360

+I think that that would probably be possible.

+00:53:02.980 --> 00:53:05.000

+The example that you had of draw data before,

+00:53:05.970 --> 00:53:08.660

+for the Marimo notebook that Vincent had,

+00:53:09.060 --> 00:53:10.760

+that is entirely running in the browser.

+00:53:11.030 --> 00:53:15.160

+So that is like the Marimo Python kernel is running in Wasm,

+00:53:15.170 --> 00:53:17.240

+and then we're also loading the JavaScript code.

+00:53:17.390 --> 00:53:20.740

+So we do have folks that just create a Marimo notebook

+00:53:21.480 --> 00:53:23.300

+and then compile it to WebAssembly,

+00:53:23.450 --> 00:53:27.320

+and then they have a static website that has both of these bits together.

+00:53:27.560 --> 00:53:29.780

+Okay, that's actually pretty wild, isn't it?

+00:53:31.040 --> 00:53:35.640

+So I've seen that there's some other places that I can use anywidget, like MyST.

+00:53:36.180 --> 00:53:36.600

+Yeah.

+00:53:37.460 --> 00:53:38.520

+And also Mosaic.

+00:53:38.580 --> 00:53:39.180

+Tell us about these.
+

+00:53:39.640 --> 00:53:40.400

+Yeah, yeah, definitely.

+00:53:40.860 --> 00:53:50.140

+So MyST, I think on our community page, we have a notion of widgets, and then we have a notion of host platforms.

+00:53:50.460 --> 00:53:59.440

+And so the two primary host platforms right now are things that are Jupyter-based, so like VS Code notebooks, Google Colab,

+00:54:01.460 --> 00:54:05.980

+Binder, like all these things that are based off of a Jupyter kernel and run Python behind the scenes.

+00:54:07.140 --> 00:54:11.780

+Then Marimo is another host platform, and MyST is currently working on an integration

+00:54:12.660 --> 00:54:17.740

+around that specification for the front-end code. And so the idea is like, as long as they have a way

+00:54:17.920 --> 00:54:22.880

+to, and I'm not sure how far along that is at the moment, but as long as there is a way to run Python

+00:54:23.100 --> 00:54:27.220

+code and sort of maintain that relationship between that front-end code and back-end code, then

+00:54:27.920 --> 00:54:31.840

+you could drop anywidgets within your MyST markdown, for example, as well.

+00:54:31.880 --> 00:54:32.080

+Yeah.

+00:54:32.760 --> 00:54:33.160

+MyST is wild.

+00:54:33.200 --> 00:54:35.500

+I had that team on the podcast a while ago.

+00:54:35.640 --> 00:54:42.000

+It's like, you want to create an e-book based on notebooks or whatever, LaTeX paper, or

+00:54:42.060 --> 00:54:42.720

+you name it, right?

+00:54:42.940 --> 00:54:43.220

+Yeah.

+00:54:43.920 --> 00:54:44.580

+And yeah.

+00:54:45.060 --> 00:54:46.100

+I haven't done anything with Mosaic.

+00:54:46.560 --> 00:54:47.080

+Yeah, Mosaic.

+00:54:47.980 --> 00:54:49.620

+I think this is a really cool project.

+00:54:50.680 --> 00:54:53.680

+So at its core, Mosaic is a bit more of like an architecture.
+

+00:54:54.100 --> 00:54:56.360

+And this is probably one of the things that I'm most excited about,

+00:54:56.840 --> 00:54:59.380

+like the ability that anywidget is bringing to folks

+00:54:59.580 --> 00:55:02.460

+to experiment with new types of architectures

+00:55:02.800 --> 00:55:04.840

+for building high-performance visualizations.

+00:55:05.860 --> 00:55:09.980

+So inside of Mosaic, you have a notion of some database source

+00:55:10.280 --> 00:55:12.000

+that I believe is typically DuckDB.

+00:55:13.160 --> 00:55:14.700

+And then you write a bunch of front-end code

+00:55:14.740 --> 00:55:17.440

+that basically expresses all of the data that it needs

+00:55:17.440 --> 00:55:18.640

+as queries to that database.

+00:55:18.940 --> 00:55:20.920

+And then there's a lot of optimizations that can happen

+00:55:21.180 --> 00:55:24.060

+between those individual views and the database

+00:55:24.080 --> 00:55:28.080

+that can all be coordinated and allow for very scalable data visualizations.

+00:55:29.350 --> 00:55:33.740

+One cool thing about Mosaic is that this architecture lends itself very well to notebooks,

+00:55:34.220 --> 00:55:38.960

+because you can keep that database, that DuckDB, running completely in the kernel,

+00:55:39.580 --> 00:55:42.760

+and then you just have your front-end client. But then there's also an option where you just

+00:55:42.980 --> 00:55:48.240

+completely compile everything into WebAssembly, and then that DuckDB is running in the browser as

+00:55:48.260 --> 00:55:53.500

+well.
And so you get to reuse a lot of this architecture, and it allows just a lot of code

+00:55:53.500 --> 00:55:57.920

+reuse. And you're just kind of moving that lever of like, okay, do I need like high-performance compute?

+00:55:58.180 --> 00:56:03.060

+Okay, let me put my database in my HPC and then I'll just have a thin client on top of it. Versus,

+00:56:03.520 --> 00:56:07.260

+okay, actually I want to look at everything in the browser. Like maybe we can just compile to WebAssembly

+00:56:07.440 --> 00:56:12.260

+and let someone drop in a CSV, and then we can reuse the same visualization. So yeah,

+00:56:12.320 --> 00:56:18.120

+this is super, yeah, that's super flexible. DuckDB has a very interesting WebAssembly story as well, so

+00:56:18.480 --> 00:56:23.480

+yeah, definitely, it's definitely catching a lot of attention. And people, you know, it's,

+00:56:23.500 --> 00:56:26.780

+It's the data science SQLite, right?

+00:56:26.900 --> 00:56:27.040

+Yeah.

+00:56:27.760 --> 00:56:30.960

+And I'd say like a growing trend that I've seen by being able to,

+00:56:31.300 --> 00:56:33.080

+like one thing I didn't, that didn't fully,

+00:56:33.440 --> 00:56:35.380

+I wasn't fully aware of when I started on anywidget,

+00:56:35.500 --> 00:56:39.480

+but it has been fun to see, is because it became easier to sort of start playing

+00:56:39.880 --> 00:56:42.280

+with like both front end code and Python code in the same environment.

+00:56:42.820 --> 00:56:46.100

+Like there've been a lot more, I think, like experimentations around,

+00:56:46.240 --> 00:56:50.520

+like trying out new types of architectures for building like visualizations.
+

+00:56:50.640 --> 00:56:53.460

+And so one trend that I've noticed in our community gallery

+00:56:53.480 --> 00:56:55.140

+is that a lot of our most popular widgets

+00:56:55.720 --> 00:56:59.660

+are taking advantage of new data formats like Apache Arrow

+00:57:00.200 --> 00:57:02.440

+to be able to send lots of data to the browser

+00:57:02.560 --> 00:57:04.740

+to put on the GPU and render very quickly.

+00:57:05.260 --> 00:57:08.720

+And so basically, there's a really cool example

+00:57:08.780 --> 00:57:10.120

+of a widget called Lonboard,

+00:57:10.480 --> 00:57:12.700

+which is a geospatial visualization widget,

+00:57:13.240 --> 00:57:16.300

+which is a wrapper around a front-end library called deck.gl

+00:57:16.300 --> 00:57:19.420

+that was made by Uber to do geospatial visualization.

+00:57:20.080 --> 00:57:21.700

+And the previous Jupyter integration,

+00:57:22.380 --> 00:57:27.060

+like, serialized data to JSON to be able to like render inside of deck.gl.

+00:57:27.220 --> 00:57:29.380

+So the moment that you had to render maybe a few million points,

+00:57:29.540 --> 00:57:31.740

+it's like you're spending a ton of time serializing that data

+00:57:31.900 --> 00:57:33.380

+to be able to put it in the browser.

+00:57:33.800 --> 00:57:33.880

+Right.

+00:57:34.300 --> 00:57:35.400

+It's like using an ORM.

+00:57:35.520 --> 00:57:38.020

+They're great until you try to do a query with 100,000 records.

+00:57:38.180 --> 00:57:39.220

+You're like, why is this so slow?

+00:57:39.619 --> 00:57:39.940

+Exactly.

+00:57:40.180 --> 00:57:42.540

+And so I think there's an example with Lonboard where

+00:57:44.140 --> 00:57:47.340

+some data set with like 3.5 million rows just like crashed in the previous integration,

+00:57:48.019 --> 00:57:50.340

+or never rendered, maybe it took a few seconds to render,

+00:57:51.080 --> 00:57:52.380

+or a minute to render.
+
+00:57:53.720 --> 00:57:55.640
+But Lonboard uses Apache Arrow
+
+00:57:55.960 --> 00:57:58.960
+and grabs that data from either Arrow, Parquet,
+
+00:57:59.040 --> 00:58:00.260
+or all these different file formats.
+
+00:58:00.840 --> 00:58:01.940
+And that's very fast to do,
+
+00:58:02.360 --> 00:58:04.820
+because you just copy that buffer and ship it to the front end
+
+00:58:04.850 --> 00:58:05.700
+and put it on the GPU.
+
+00:58:06.320 --> 00:58:08.040
+And so it can do that same example in, I think,
+
+00:58:08.610 --> 00:58:10.800
+one or two seconds on modern machines.
+
+00:58:11.130 --> 00:58:13.660
+And so it really is this, get your data into a data frame
+
+00:58:13.770 --> 00:58:15.760
+and dump it on the GPU and visualize it.
+
+00:58:16.800 --> 00:58:18.420
+And that whole idea of,
+
+00:58:18.620 --> 00:58:23.660
+oh, what format do I need to convert to to be able to use this tool, this visualization tool
+
+00:58:23.760 --> 00:58:27.860
+that I've heard of? It's like, no, no, no, just load it as a data frame. And then as long as you get
+
+00:58:27.860 --> 00:58:33.580
+your data as a data frame, we can throw it into the web browser. Yeah, super neat. That's the kind
+
+00:58:33.630 --> 00:58:37.240
+of stuff that normal web people don't think about, right? You're like, well, how much is it to
+
+00:58:37.520 --> 00:58:41.660
+serialize a little JSON response? Or, you know, we're talking about a million of them. Oh.
+
+00:58:42.540 --> 00:58:48.420
+Yeah, yeah, exactly. And I guess one thing I've been impressed with is just, at times, where my
+
+00:58:48.480 --> 00:58:54.340
+understanding of how good computers have gotten is, like... it's just shocking. It's more like, oh,
+
+00:58:54.380 --> 00:58:59.320
+I think we're holding ourselves back at times by maybe some of the standard practices versus like,
+
+00:59:00.240 --> 00:59:03.580
+yeah, hardware has gotten very good. 
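The JSON-versus-binary gap being discussed is easy to measure even without any of the actual libraries involved. This is only a stdlib sketch of the general idea, not the widget's real code: a columnar binary buffer skips the per-value text formatting and parsing that makes JSON painful at millions of points, and Apache Arrow is the standardized, zero-copy-friendly version of that binary path.

```python
# Stdlib-only illustration: one column of floats serialized two ways.
# The binary path is a single contiguous buffer copy (8 bytes per value);
# the JSON path formats every number as text.
import array
import json
import time

n = 100_000
values = [i * 0.1 for i in range(n)]  # a column of floats

start = time.perf_counter()
as_json = json.dumps(values).encode()           # text-format every value
json_time = time.perf_counter() - start

start = time.perf_counter()
as_binary = array.array("d", values).tobytes()  # one contiguous buffer
binary_time = time.perf_counter() - start

print(f"JSON:   {len(as_json):,} bytes in {json_time * 1000:.1f} ms")
print(f"binary: {len(as_binary):,} bytes in {binary_time * 1000:.1f} ms")
```

On most machines the binary path is dramatically faster; Arrow layers a standardized columnar layout on top of this idea so JavaScript in the browser can read the same buffer without re-parsing it.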
And if you can make use of that hardware efficiently,
+
+00:59:04.820 --> 00:59:09.020
+there's quite a bit that you can do on very low-powered devices and visualize a ton of data.
+
+00:59:10.000 --> 00:59:15.120
+Yeah. Quite a while ago, like a really long time ago, I did some OpenGL programming and you see,
+
+00:59:15.180 --> 00:59:19.740
+you know, like here, we gave this scene a hundred thousand triangles and it's rendering
+
+00:59:19.940 --> 00:59:25.600
+at 200 frames a second. You're like, how, how many operations is that? Like, that is insane. You know,
+
+00:59:25.680 --> 00:59:29.060
+this is like 30 years ago. You were just like, that just blows my mind. Yeah. I think people
+
+00:59:29.300 --> 00:59:32.920
+underestimate what computers can do sometimes, especially if you work with them, right? You're
+
+00:59:32.920 --> 00:59:37.720
+not doing three levels of serialization and deserialization and, and so on. Yeah. Yeah,
+
+00:59:37.980 --> 00:59:42.240
+exactly. That's wild. So what about publishing one of these things? If we go and write,
+
+00:59:42.620 --> 00:59:51.000
+I've created an AnyWidget and it's amazing for me to use, but I want to put it on PyPI so people can uv pip install it or whatever in their notebook.
+
+00:59:51.400 --> 00:59:51.640
+Yeah.
+
+00:59:52.310 --> 00:59:52.960
+So I have a couple.
+
+00:59:53.540 --> 01:00:02.660
+One, I would recommend if folks are really interested in it, there's a couple of videos I have on my YouTube of just like publishing some widgets and they're linked in the AnyWidget documentation.
+
+01:00:03.240 --> 01:00:08.000
+But that file that you see there, that's open in the current view that we have.
+
+01:00:09.020 --> 01:00:12.220
+If you know how to publish a Python file, you can just publish that to PyPI.
+
+01:00:12.580 --> 01:00:13.280
+And it should just work.
+
+01:00:13.700 --> 01:00:17.080
+So yeah, the one caveat I'd have is like,
+
+01:00:17.220 --> 01:00:18.740
+you're looking at an inline string here,
+
+01:00:18.940 --> 01:00:20.120
+and that's not always the nicest way
+
+01:00:20.300 --> 01:00:21.920
+to write your front end code.
+
+01:00:22.420 --> 01:00:23.540
+So you can also just import,
+
+01:00:23.900 --> 01:00:26.640
+like you can require that as a file instead.
+
+01:00:26.820 --> 01:00:28.320
+And so you can put that code into a separate file,
+
+01:00:28.780 --> 01:00:31.680
+type check that or like lint it or like format it
+
+01:00:31.700 --> 01:00:32.180
+as JavaScript.
+
+01:00:32.740 --> 01:00:34.220
+- So write a JavaScript file,
+
+01:00:34.480 --> 01:00:35.800
+and then somehow set just,
+
+01:00:36.020 --> 01:00:38.140
+would I just read that in Python?
+
+01:00:39.120 --> 01:00:40.640
+And then just set the text result,
+
+01:00:40.820 --> 01:00:43.240
+like use Path and say read_text and then jam it in there?
+
+01:00:43.680 --> 01:00:46.300
+Yeah, we support, if you just pass a path,
+
+01:00:46.410 --> 01:00:48.840
+a pathlib Path to ESM, we'll read it for you.
+
+01:00:49.320 --> 01:00:50.580
+Oh, okay, perfect, yeah.
+
+01:00:51.180 --> 01:00:53.060
+So as long as I have a package
+
+01:00:53.090 --> 01:00:56.580
+and the package points out that these JavaScript files
+
+01:00:56.780 --> 01:00:58.980
+are to be included at build, then you're good to go?
+
+01:00:59.520 --> 01:01:01.280
+Yep, and our starter kit template
+
+01:01:01.500 --> 01:01:04.600
+is all configured with Hatchling to do that for you.
+
+01:01:04.980 --> 01:01:05.600
+Yeah, very nice.
+
+01:01:05.990 --> 01:01:10.340
+Okay, and I've been using uv build, uv publish these days,
+
+01:01:10.520 --> 01:01:11.720
+and it's pretty seamless.
+
+01:01:12.160 --> 01:01:12.660
+Yeah, yeah.
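The inline-string-versus-file refactor being described looks roughly like this. The class and file names below are made up for illustration, and the try/except is only so the sketch runs without anywidget installed; the relevant part is that anywidget accepts a `pathlib.Path` for `_esm` and reads the file for you, so the JavaScript can live in a real `.js` file that you lint and format as JavaScript.

```python
# Sketch: moving widget front-end code out of an inline string
# and into its own .js file, then pointing _esm at that file.
import pathlib
import tempfile

pkg = pathlib.Path(tempfile.mkdtemp())

# Front-end code as an ordinary JavaScript file, not a Python string.
(pkg / "index.js").write_text(
    "function render({ model, el }) {\n"
    "  el.textContent = `count is ${model.get('value')}`;\n"
    "}\n"
    "export default { render };\n"
)

try:
    import anywidget
    import traitlets

    class CounterWidget(anywidget.AnyWidget):
        _esm = pkg / "index.js"  # a Path, not an inline string
        value = traitlets.Int(0).tag(sync=True)
except ImportError:
    # anywidget isn't installed here; the Path-based wiring above
    # is the pattern this sketch is meant to show.
    CounterWidget = None

print((pkg / "index.js").exists())
```

Packaging-wise, the point in the conversation is that the `.js` file then just needs to ship inside the wheel, which the anywidget starter template's Hatchling configuration handles for you.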
+
+01:01:12.720 --> 01:01:15.300
+I think maybe we might need to update the build script
+
+01:01:15.330 --> 01:01:16.760
+to have uv build as the backend.
+
+01:01:17.220 --> 01:01:17.640
+So, yeah.
+
+01:01:17.820 --> 01:01:19.820
+Yeah, even if you do uv build against something
+
+01:01:19.870 --> 01:01:20.640
+with a different backend,
+
+01:01:20.710 --> 01:01:22.540
+I think uv just runs that backend instead.
+
+01:01:22.770 --> 01:01:24.260
+So not a big deal.
+
+01:01:24.530 --> 01:01:25.020
+Not a big deal.
+
+01:01:25.440 --> 01:01:27.460
+So what are maybe some of the rough edges
+
+01:01:27.680 --> 01:01:30.140
+that people need to look out for these days?
+
+01:01:30.720 --> 01:01:32.400
+It's got a lot of stuff that makes things easy,
+
+01:01:32.640 --> 01:01:34.760
+but what should they be maybe on the lookout for?
+
+01:01:35.120 --> 01:01:39.060
+I think the thing is that, if folks are getting into it...
+
+01:01:39.580 --> 01:01:44.100
+hopefully, if you want to try out a widget, it should be as simple as installing any other Python package.
+
+01:01:44.340 --> 01:01:52.480
+And that is really a goal that we've had for the project, is that if you want to use this like you would some algorithm library that just does computation,
+
+01:01:52.840 --> 01:01:56.440
+we want that to be as simple as pip installing and getting started.
+
+01:01:57.340 --> 01:02:04.200
+I would say probably the highest barrier to entry is that it is just this idea of just enough JavaScript.
+
+01:02:04.640 --> 01:02:10.100
+And so one thing that I try to emphasize in the docs is that I'd really start with a very
+
+01:02:10.290 --> 01:02:15.660
+simple example if you've never played around in the browser and get used to opening up
+
+01:02:15.740 --> 01:02:20.860
+your developer tools and logging things and just understanding how to develop maybe a
+
+01:02:20.940 --> 01:02:21.760
+little bit in the front end.
+
+01:02:22.120 --> 01:02:25.160
+If you're coming from the front end side, then it would be learning a little bit more
+
+01:02:25.170 --> 01:02:29.560
+of how does Python work and how do I debug things on the Python side.
+
+01:02:29.660 --> 01:02:32.440
+And what is PyPI and how do I even work with that thing?
+
+01:02:32.700 --> 01:02:37.060
+Exactly. So it's really, I'd say, most of the rough edges are just these problems of working
+
+01:02:37.120 --> 01:02:41.860
+with two language ecosystems. But that's also been a very important part of AnyWidget because we don't
+
+01:02:41.940 --> 01:02:46.720
+want people that are ambitious to build things that sort of bridge this gap to feel restricted
+
+01:02:46.940 --> 01:02:51.320
+from either using things in the Python ecosystem or the web ecosystem. Because we could always write
+
+01:02:51.480 --> 01:02:56.320
+wrappers to make things easier for authors, but then we'll probably get a bunch of issues that
+
+01:02:56.320 --> 01:03:00.440
+then say, hey, I want to use this API and you haven't exposed it. And they're like, hey, we'd like to use
+
+01:03:00.460 --> 01:03:03.080
+Vue. You're like, well, you use our Python thing that defines the DOM.
+
+01:03:03.140 --> 01:03:03.760
+Like, oh, okay.
+
+01:03:04.320 --> 01:03:05.100
+Yeah, exactly.
+
+01:03:05.880 --> 01:03:11.020
+So with great power comes great responsibility, but we want to give that to people that want
+
+01:03:11.020 --> 01:03:12.340
+to build these types of integrations.
+
+01:03:12.460 --> 01:03:12.600
+Yeah.
+
+01:03:13.300 --> 01:03:16.320
+So what about agentic AI for this kind of stuff?
+
+01:03:16.460 --> 01:03:20.260
+I know five years ago, you had to learn every step of the JavaScript.
+
+01:03:20.420 --> 01:03:25.860
+How much can I say, take my inline JavaScript, put it into a JS file, and then point it at,
+
+01:03:26.180 --> 01:03:30.300
+you know, ask Claude Sonnet, hey, this is an AnyWidget widget, and this is the JavaScript.
+
+01:03:30.520 --> 01:03:31.560
+and here's what I'm trying to accomplish.
+
+01:03:31.900 --> 01:03:33.540
+I bet you could go pretty far
+
+01:03:34.000 --> 01:03:36.280
+with a marginal understanding of JavaScript
+
+01:03:36.700 --> 01:03:39.140
+as a Python person, plus some kind of AI.
+
+01:03:39.250 --> 01:03:39.740
+What do you think?
+
+01:03:39.960 --> 01:03:41.040
+Oh, yeah, absolutely.
+
+01:03:41.330 --> 01:03:43.780
+I think, like I mentioned earlier,
+
+01:03:43.920 --> 01:03:46.260
+Vincent and I work very closely together.
+
+01:03:46.760 --> 01:03:48.280
+And his joke that he likes to tell
+
+01:03:48.310 --> 01:03:49.680
+is that I created anywidget
+
+01:03:49.860 --> 01:03:51.020
+so that he could make any widgets.
+
+01:03:53.040 --> 01:03:55.620
+And the ability to vibe code
+
+01:03:55.760 --> 01:03:58.220
+sort of the front end for an object
+
+01:03:58.500 --> 01:04:00.260
+really is eye-opening if you've never tried it
+
+01:04:00.340 --> 01:04:02.100
+before, if you're working with your data.
+
+01:04:02.330 --> 01:04:05.260
+And I think one of the barriers maybe of trying out
+
+01:04:05.560 --> 01:04:09.740
+or of thinking about how to use anywidget
+
+01:04:09.830 --> 01:04:11.120
+is knowing when it might be useful.
+
+01:04:11.330 --> 01:04:13.100
+And really, it's like anytime you have an object
+
+01:04:13.300 --> 01:04:15.720
+that you might want to visualize in a particular way
+
+01:04:16.130 --> 01:04:19.720
+or maybe extend with some extra capabilities
+
+01:04:20.200 --> 01:04:21.340
+in your notebook environment,
+
+01:04:21.760 --> 01:04:25.620
+that's a perfect time to open up some agentic tool
+
+01:04:25.750 --> 01:04:28.460
+and start trying to explain what you're thinking about.
+
+01:04:29.200 --> 01:04:34.720
+Because it's just JavaScript and not, like, a custom framework that we wrote. And it's not like...
+
+01:04:35.080 --> 01:04:39.320
+they're very good at, like, writing this type of code. 
And so sometimes you might need to, + +01:04:39.420 --> 01:04:43.260 +we have a set of rules to try to like help it, make sure it doesn't like try to do React or + +01:04:43.380 --> 01:04:48.460 +something, but it can go very far. And because it's just like a view on top of like your data, + +01:04:48.720 --> 01:04:52.260 +it's like, it's kind of the perfect thing to play around with vibe coding, I think is. + +01:04:52.640 --> 01:04:56.760 +It's incredible how far you can go with this stuff. You can, you can give it your data source + +01:04:56.780 --> 01:04:57.740 +and say, here's the data source, + +01:04:57.900 --> 01:04:58.660 +here's what I'm trying to do. + +01:04:58.790 --> 01:05:00.780 +And because the data sources often can be + +01:05:01.030 --> 01:05:02.720 +really foundational like CSV or whatever. + +01:05:02.990 --> 01:05:05.860 +And then importantly, because it's just pure JavaScript, + +01:05:06.520 --> 01:05:10.300 +AI goes crazy on pure JavaScript and pure Python, right? + +01:05:10.480 --> 01:05:11.620 +It's unleashed. + +01:05:11.820 --> 01:05:12.400 +Yeah, it's crazy. + +01:05:12.720 --> 01:05:16.100 +So to plug it back to Marimo for a second, + +01:05:16.290 --> 01:05:19.560 +I think because it's just Python and just JavaScript, + +01:05:19.760 --> 01:05:20.800 +it's a really fun environment + +01:05:21.080 --> 01:05:22.660 +to actually create anywidgets in. + +01:05:22.860 --> 01:05:26.740 +And we have some integrations with like AI chat + +01:05:26.760 --> 01:05:32.680 +within our editor where we actually will inspect some of your Python variables and include that + +01:05:32.810 --> 01:05:34.620 +in the prompts as you're trying to work on things. + +01:05:34.740 --> 01:05:38.380 +So if you're saying, "Hey, I want to make a widget and I want to visualize this data + +01:05:38.580 --> 01:05:42.800 +frame," you can actually tag the object that you want to start programming around. 
+
+01:05:43.270 --> 01:05:46.680
+And that will include things like the schema and stuff as it gives it to the model, such
+
+01:05:46.920 --> 01:05:50.680
+that then that sort of grounds it, at least, when it's writing that JavaScript code,
+
+01:05:50.750 --> 01:05:54.300
+to grab the right columns and things off the data set and hit the ground running without
+
+01:05:54.660 --> 01:05:56.540
+you having to procure those things as well.
+
+01:05:56.740 --> 01:05:59.440
+So it's a pretty fun environment to create these things.
+
+01:05:59.740 --> 01:06:01.520
+It's a weird time, isn't it, that we live in?
+
+01:06:01.840 --> 01:06:04.420
+Kind of magical and also scary, but just weird.
+
+01:06:04.960 --> 01:06:05.520
+Yeah, definitely.
+
+01:06:05.960 --> 01:06:07.460
+But very much fun as well.
+
+01:06:08.080 --> 01:06:09.960
+I would throw out there, if people are interested,
+
+01:06:10.440 --> 01:06:13.960
+check out the Matt Makai episode I did on HHC programming
+
+01:06:14.340 --> 01:06:15.300
+three or four episodes ago.
+
+01:06:15.720 --> 01:06:16.500
+And choose a good model.
+
+01:06:16.840 --> 01:06:18.220
+I find people are like, I tried this.
+
+01:06:18.240 --> 01:06:19.040
+It did a bunch of junk.
+
+01:06:19.200 --> 01:06:20.840
+It's like, yeah, but you used the free tier, right?
+
+01:06:20.940 --> 01:06:21.260
+I see.
+
+01:06:22.740 --> 01:06:24.820
+There's a big difference between the top tier models
+
+01:06:25.020 --> 01:06:26.480
+and the cheaper ones.
+
+01:06:26.620 --> 01:06:31.360
+So, all right. Where are we going? What's the roadmap look like for anywidget?
+
+01:06:31.600 --> 01:06:45.980
+Yeah. So the roadmap for anywidget, I think, looks a lot like trying to get users, or authors, to understand some of these patterns that are emerging in the ecosystem for building some of these maybe more high-performance visualization tools.
+
+01:06:46.960 --> 01:06:51.520
+In terms of the library itself, there may be a few features that we want to add for users.
+
+01:06:51.640 --> 01:07:01.380
+But I think one of the most important things is that we just ensure that we stay backwards compatible now that we have something that people are building around.
+
+01:07:01.500 --> 01:07:11.920
+And so my call to action is just to try to get more folks, if they are curious, to try out building new widgets and let's keep this ecosystem pretty healthy.
+
+01:07:12.420 --> 01:07:25.940
+And then on top of that, if there are places where folks are running into limitations of the specification as is, then getting invested parties, either from, you know, the implementers, to try to understand what APIs we need to add to support things in the future.
+
+01:07:26.280 --> 01:07:26.800
+So, yeah.
+
+01:07:27.120 --> 01:07:27.460
+Amazing.
+
+01:07:28.520 --> 01:07:32.440
+It was on my mind, but I forgot to ask it when we were talking about this, where things run.
+
+01:07:33.180 --> 01:07:40.600
+If this is running in the kernel and you're doing HPC type stuff, I could potentially use Rust or C++ as part of building my widget.
+
+01:07:40.800 --> 01:07:41.060
+Is that right?
+
+01:07:41.500 --> 01:08:06.580
+Yeah, definitely. So one pattern we've seen is, like, you know, you just try to do as little as you can, just the interactive bits that you want, in the front end. And then you have the full resources of anything that you can do in Python on the back end side, right. And because Python is such a language that has sort of grown to be able to be extended with all these systems-level programming languages, there's a lot that you can tap into there in terms of, yeah, high performance compute or like.
+
+01:08:07.580 --> 01:08:23.220
+So my hope is that really we push anywidget authors to create interesting interactive elements that then allow us to really have that extra input into this really scalable data ecosystem that we have on the Python side.
+
+01:08:23.520 --> 01:08:23.700
+Awesome.
+
+01:08:24.380 --> 01:08:25.020
+PRs accepted?
+
+01:08:25.580 --> 01:08:25.980
+Absolutely.
+
+01:08:26.240 --> 01:08:26.319
+Yeah.
+
+01:08:26.880 --> 01:08:27.020
+Yeah.
+
+01:08:27.120 --> 01:08:27.259
+Yeah.
+
+01:08:27.560 --> 01:08:27.720
+Okay.
+
+01:08:28.180 --> 01:08:30.500
+So yeah, people can check it out here on GitHub.
+
+01:08:30.620 --> 01:08:31.440
+Of course, we'll link to that.
+
+01:08:31.600 --> 01:08:33.080
+And yeah, you got ideas.
+
+01:08:33.600 --> 01:08:33.819
+All right.
+
+01:08:33.920 --> 01:08:35.040
+Get in there and add it.
+
+01:08:35.259 --> 01:08:35.400
+Yeah.
+
+01:08:35.660 --> 01:08:36.500
+And we also have a Discord.
+
+01:08:36.900 --> 01:08:40.080
+So folks, like you open up a notebook
+
+01:08:40.290 --> 01:08:42.380
+and you run into something or you just, you have an idea,
+
+01:08:42.660 --> 01:08:44.560
+like there's plenty of folks, or I'm in there.
+
+01:08:45.860 --> 01:08:47.859
+And I love to try to get people
+
+01:08:47.980 --> 01:08:50.140
+through their first bit of JavaScript ever.
+
+01:08:50.270 --> 01:08:52.220
+I think it's pretty fun when someone has an idea
+
+01:08:52.589 --> 01:08:54.299
+and we help them get there.
+
+01:08:54.450 --> 01:08:55.819
+So like, but it's JavaScript.
+
+01:08:56.009 --> 01:08:57.020
+Like, no, no, you're going to be okay.
+
+01:08:57.180 --> 01:08:57.740
+It's not that much.
+
+01:08:57.900 --> 01:08:59.220
+It's just enough JavaScript, right?
+
+01:08:59.339 --> 01:08:59.700
+Exactly.
+
+01:09:00.319 --> 01:09:01.140
+Whatever that means to you.
+
+01:09:02.140 --> 01:09:02.500
+Exactly.
+
+01:09:03.319 --> 01:09:04.980
+Just enough is so that my thing works.
+
+01:09:05.400 --> 01:09:05.640
+All right.
+
+01:09:05.980 --> 01:09:07.560
+Trevor, thank you so much for being on the show.
+
+01:09:07.790 --> 01:09:08.580
+I really appreciate it.
+
+01:09:08.609 --> 01:09:09.680
+And congrats on anywidget.
+
+01:09:10.660 --> 01:09:11.620
+Looks super cool.
+
+01:09:12.060 --> 01:09:13.700
+It definitely looks like it handles any need
+
+01:09:13.779 --> 01:09:14.700
+and is easy to work with.
+
+01:09:14.920 --> 01:09:16.180
+Yeah, thanks so much for having me.
+
+01:09:16.480 --> 01:09:16.920
+Yeah, you bet.
+
+01:09:17.190 --> 01:09:17.480
+See you later.
+
+01:09:17.819 --> 01:09:18.140
+See ya.
+
+01:09:19.600 --> 01:09:21.720
+This has been another episode of Talk Python To Me.
+
+01:09:22.100 --> 01:09:22.819
+Thank you to our sponsors.
+
+01:09:23.069 --> 01:09:24.319
+Be sure to check out what they're offering.
+
+01:09:24.529 --> 01:09:25.880
+It really helps support the show.
+
+01:09:26.540 --> 01:09:27.980
+Look into the future and see bugs
+
+01:09:28.190 --> 01:09:29.339
+before they make it to production.
+
+01:09:30.360 --> 01:09:32.000
+Sentry's Seer AI code review
+
+01:09:32.500 --> 01:09:34.859
+uses historical error and performance information
+
+01:09:34.880 --> 01:09:41.299
+at Sentry to find and flag bugs in your PRs before you even start to review them. Stop bugs before
+
+01:09:41.350 --> 01:09:47.620
+they enter your code base. Get started at talkpython.fm/seer-code-review. And this
+
+01:09:47.710 --> 01:09:53.400
+episode is sponsored by JetBrains and the PyCharm team. This week only through December 19th, 2025,
+
+01:09:54.100 --> 01:10:00.440
+get 30% off of PyCharm Pro, including renewals, and PyCharm will donate all the proceeds to the
+
+01:10:00.400 --> 01:10:06.460
+Python Software Foundation. Support the PSF by getting or renewing PyCharm. 
Visit talkpython.fm
+
+01:10:06.600 --> 01:10:13.160
+/pycharm-psf-2025 and use the code strongerpython. Both of these are in your podcast
+
+01:10:13.280 --> 01:10:18.520
+player show notes. If you or your team needs to learn Python, we have over 270 hours of beginner
+
+01:10:18.820 --> 01:10:24.260
+and advanced courses on topics ranging from complete beginners to async code, Flask, Django,
+
+01:10:24.580 --> 01:10:24.800
+HTML, and more.
+
+01:10:31.060 --> 01:10:35.240
+And if you're not already subscribed to the show on your favorite podcast player,
+
+01:10:35.850 --> 01:10:36.540
+what are you waiting for?
+
+01:11:02.440 --> 01:11:05.240
+I'm out.
+
diff --git a/transcripts/531-talk-python-in-prod.txt b/transcripts/531-talk-python-in-prod.txt
new file mode 100644
index 0000000..e43070e
--- /dev/null
+++ b/transcripts/531-talk-python-in-prod.txt
@@ -0,0 +1,2600 @@
+00:00:00 Have you ever thought about getting your small product into production,
+
+00:00:02 but are worried about the cost of the big cloud providers?
+
+00:00:05 Or maybe you think your current cloud service is over-architected
+
+00:00:08 and costing you too much?
+
+00:00:10 Well, in this episode, we interview Michael Kennedy, author of Talk Python
+
+00:00:14 in Production, a new book that guides you through deploying web apps
+
+00:00:17 at scale with right-sized engineering.
+
+00:00:20 This is Talk Python To Me, episode 531, recorded November 26, 2025.
+
+00:00:44 Welcome to Talk Python To Me, a weekly podcast on Python.
+
+00:00:48 This is your guest host, Christopher Trudeau.
+
+00:00:50 Follow me on Bluesky, where I'm trudeau.dev.
+
+00:00:53 You can follow the podcast or this week's guest on Mastodon, @talkpython for the show
+
+00:00:59 and @mkennedy for the guest, both on fosstodon.org.
+
+00:01:03 And keep up with the show and listen to over nine years of episodes at talkpython.fm.
+ +00:01:09 If you want to be part of our live episodes, you can find the live streams over on YouTube, + +00:01:14 subscribe to our YouTube channel at talkpython.fm/youtube and get notified about upcoming shows. + +00:01:21 Look into the future and see bugs before they make it to production. + +00:01:25 Sentry's Seer AI code review uses historical error and performance information at Sentry + +00:01:30 to find and flag bugs in your PRs before you even start to review them. + +00:01:35 Stop bugs before they enter your code base. + +00:01:37 Get started at talkpython.fm/seer-code-review. + +00:01:42 And it's brought to you by Agency. + +00:01:44 Discover agentic AI with Agency. + +00:01:47 Their layer lets agents find, connect, and work together. + +00:01:50 Any stack, anywhere. + +00:01:51 Start building the internet of agents at talkpython.fm/agency spelled A-G-N-T-C-Y. + +00:01:58 Michael, welcome to Talk Python To Me. + +00:02:00 You know, I looked it up. + +00:02:01 You've been on the show more than anyone else. + +00:02:03 But in case there's new listeners, tell us a bit about yourself. + +00:02:06 Incredible. + +00:02:07 Good to be here with you, Christopher. + +00:02:09 A bit of a turn of the tables, I would say. + +00:02:13 And it's, you know, long time listeners, I'm sure they know all the little details because + +00:02:18 I work them in here and there. + +00:02:20 I think that's kind of fun to just make things personal. + +00:02:22 But I've also said this on the show, and I'm sure it's a surprising fact that you know as well, + +00:02:27 but over half of the people in the Python space have only been here for two years. + +00:02:31 Yeah, I keep seeing that stat. + +00:02:33 Yep. + +00:02:34 That's crazy, right? + +00:02:34 So even if people, you know, I told this story about my background five years ago, + +00:02:39 like those people weren't here, like half of them. + +00:02:41 So crazy, crazy stuff. 
+
+00:02:43 All right, so my backstory, I was, I thought I would be a mathematician.
+
+00:02:48 I studied math,
+
+00:02:49 stuff like that in college, was working on my PhD, started doing a bunch of work with Silicon Graphics,
+
+00:02:57 mainframe, supercomputer type stuff so that I could do my math research.
+
+00:03:00 And I realized, wow, this programming stuff is way more fun than math.
+
+00:03:04 How do I change gears?
+
+00:03:05 And so that was like 1998, 99.
+
+00:03:09 Haven't looked back.
+
+00:03:10 I've been programming since then and super fun, a couple of languages and around 10, 11 years ago,
+
+00:03:16 started Talk Python, a year after that, quit my job.
+
+00:03:19 Made Talk Python my full-time job.
+
+00:03:22 Started offering courses as well.
+
+00:03:23 That's something that people don't necessarily know.
+
+00:03:26 That sometimes they'll ask, well, what do you do for your job, actually?
+
+00:03:28 I'm like, well, we're doing it.
+
+00:03:31 So anyway, that's me.
+
+00:03:34 A super, super fan of Python.
+
+00:03:36 Super fan of programming.
+
+00:03:38 Every day I wake up just like, wow, how awesome is today?
+
+00:03:40 Look at all the new stuff that came out that we learned that we can do.
+
+00:03:43 Like new libraries, AI stuff these days.
+
+00:03:47 Yeah, there's always plenty to talk about.
+
+00:03:49 It's incredible times.
+
+00:03:50 It's incredible times.
+
+00:03:51 And you've added a new trophy to the mantle, I guess.
+
+00:03:57 You've written a book.
+
+00:03:58 I have written a book.
+
+00:04:00 You know, that's a little bit, I'll put it this way.
+
+00:04:02 It's not something I ever saw myself doing, but I'm really excited that I did.
+
+00:04:07 And yeah, it took, I spent a couple of months properly writing it.
+
+00:04:12 You know, I really put my energy in, and like all projects, you think you're about done.
+
+00:04:18 Yeah, that first 80% is nothing like the last 80%.
+
+00:04:22 No, and the last 5% is long.
+ +00:04:25 You know, and it's not just the book. + +00:04:28 It's like, okay, well, where am I going to sell it? + +00:04:30 Okay, well, Amazon, and then I'll self-publish, or do I use a publisher? + +00:04:34 You end up self-publishing it, but then you're like, how do I? + +00:04:37 You know, all these things you learn for the first time. + +00:04:39 Like, how do I get it into Amazon to sell even? + +00:04:42 And there's a bunch of decisions. + +00:04:44 I can tell you, even with having taken the publishing route, it's no easier. + +00:04:48 It's just that it goes dark for two months and then all of a sudden it's like, you need + +00:04:52 to do this by yesterday. + +00:04:53 So yeah, it's not necessarily an advantage either way, I think. + +00:04:57 Yeah. + +00:04:57 Yeah. + +00:04:58 Yeah. + +00:04:58 I was really on the fence and I thought, look, let me just try this myself. + +00:05:03 I got a few podcast listeners. + +00:05:04 I can let them know about it. + +00:05:06 An audience helps. + +00:05:08 I honestly think it was probably the right choice for me. + +00:05:11 And for those who haven't come across it yet, do you want to give us the one paragraph version? + +00:05:17 Yeah. + +00:05:17 So the book is called Talk Python in Production. + +00:05:20 There are other books that are, I'm pretty sure one is called Python in Production + +00:05:24 or other things about how do you get your Python app into production. + +00:05:27 But this is Talk Python in Production because it's sort of a dual role storytelling vehicle. + +00:05:33 Obviously it's nonfiction, it's a technical book. + +00:05:35 But the idea was, let me tell not a generic story of how you might run some of your Python code in production, + +00:05:42 mostly APIs, web apps, databases, like that kind of stuff, right? 
Not a general story of that, but my story, right? I've been on this journey of not complete noob,
+
+00:05:55 but pretty significantly lost, getting my original Python app out into the world, to pretty confident,
+
+00:06:03 running stuff in, I think, a simpler-than-usual way, in a good way, right? One of the things I really
+
+00:06:11 liked about the book is, it's not quite changing gears, but you do a nice mix of sort of the
+
+00:06:19 decision-making process versus the "here's exactly what I did." And so you get a little bit of both.
+
+00:06:25 And honestly, the decision-making process is something I find often isn't there in a lot of
+
+00:06:30 work. You know, your standard blog post is always, well, and then add exactly this to
+
+00:06:36 exactly this file. But I think I really sort of enjoyed the "and this is what I tried,
+
+00:06:41 and this is why I changed." And you're very kind of humble about it. Like a lot of
+
+00:06:47 folks who write this kind of content, it's "thou shalt do this," and you're like, this is the way
+
+00:06:51 I've told you. Yeah, yeah, this worked for me. And yeah, so I really like the fact that you've
+
+00:06:57 kind of blended that in. What made you decide to do this? So you said
+
+00:07:03 you didn't really have the itch to go down the path, so, what... I'm not sure I didn't have the
+
+00:07:08 itch, I just didn't think that it was something I was capable of. Ah, I see. Okay. You know what I mean?
+
+00:07:14 Not that I didn't think, if I literally took two years of my life and went into like a cabin,
+
+00:07:20 Thoreau-style or something, I could come out with a book, I'm pretty sure. But given all the
+
+00:07:25 constraints of, like, I have a family and I've got to keep Talk Python running, like, in that sense I didn't,
+
+00:07:30 I didn't think I would be able to do it, but.
+
+00:07:32 - Yeah, it's perseverance more than anything else, I think.
+ +00:07:35 Yeah, yeah, for sure. + +00:07:36 - Exactly. + +00:07:38 So, yeah, go ahead. + +00:07:39 - Sorry, go ahead. + +00:07:39 No, no, go ahead. + +00:07:40 - You know, so why did I write it? + +00:07:43 Two reasons. + +00:07:44 One, I think it's an interesting story, and I thought people would enjoy hearing it, + +00:07:48 like the personal side that you mentioned a little bit. + +00:07:50 I thought people would appreciate that. + +00:07:52 And maybe more significantly, I feel like a lot of the advice out there + +00:07:58 in the tech space in general, but for now we're focused on like, + +00:08:02 how do I deploy my app sort of like Python plus DevOps + +00:08:05 type of thing. + +00:08:05 But I think a lot of the advice out there is a little bit of a flex in the sense that, + +00:08:12 oh, look at this, we're using these seven services and then this technology, + +00:08:17 and then we're auto scaling it with that. + +00:08:18 And then we have these logs pumped over to this other thing. + +00:08:21 You're like, whoa, okay, that's kind of cool. 
But a lot of people who just have an app, they just want to go from, like, "it works on my machine" to "look what I made." They see that and go,
+
+00:08:32 I can't do that. You know what? It's not for me. It's just like, I can't spend $500 on this
+
+00:08:38 infrastructure, and I don't feel worthy if I don't have, you know, completely geo-distributed,
+
+00:08:46 redundant databases. And like, you don't need that, you know what I mean? And people keep asking me,
+
+00:08:50 like, hey, Michael, can you give me some advice? I'm like, well, not that. And finally I'm like, let me
+
+00:08:55 just tell the story, you know? And so that was a big motivation. You see it in industry a lot.
+
+00:09:01 It's sometimes referred to as, you know, resume-based architecture, right? Like, is it a "do we need
+
+00:09:06 this, or is this because I'm trying to learn it?" And I think there's always that, oh, some of it's
+
+00:09:12 aspirational, right? "We will be Netflix, and so, you know, we need to be on every continent and all
+
+00:09:19 the rest of it." Right, right. It's very aspirational. It's like, I'm going to build this app,
+
+00:09:23 And the reason I'm building it is it's going to take off.
+
+00:09:26 And that day when the hockey stick hits, I'm ready.
+
+00:09:30 Yeah.
+
+00:09:30 You know what I mean?
+
+00:09:31 There's also a, you know, I think there's a lack of recognition of the cost of both in
+
+00:09:40 energy and money, energy as in human effort.
+
+00:09:42 I'm not talking about electricity, of the next 1%, right?
+
+00:09:47 So like getting from 90% uptime to 95% uptime costs something. 
+
+00:09:52 Getting from 95 to 96 costs more than that, and getting from 96... And once you're getting into
+
+00:09:58 like the four nines, five nines thing, then, you know, Cloudflare goes down and you're all screwed
+
+00:10:03 anyways, right? So it's so ironic. It is 100% ironic that you take all these steps and you employ
+
+00:10:11 all these services, and it's the complexity of those services that went down. Like, yeah, you know,
+
+00:10:16 this show will come out in a couple of weeks, but we're just on the eve of basically three weeks
+
+00:10:22 of important things going down.
+
+00:10:24 First AWS, then Azure, and then GitHub.
+
+00:10:28 And also, and then Cloudflare, so let's put that as four,
+
+00:10:30 within three weeks, right?
+
+00:10:32 And the AWS one was like, the reason it went down is DynamoDB had some sort of DNS problem.
+
+00:10:40 Even if you're not using that, the thing you're using,
+
+00:10:42 like Lambda, depends upon DynamoDB for itself to work.
+
+00:10:46 So it was just like a cascade of kabang, right?
+
+00:10:49 And that's a little bit of this complexity.
+
+00:10:51 Like the more complexity you buy in, even if it's not yours, it is yours in a sense.
+
+00:10:56 Yeah, yeah.
+
+00:10:56 And there's always humans involved, right?
+
+00:10:58 So there's always fallibility somewhere, right?
+
+00:11:03 Although one of the arguments I have seen recently
+
+00:11:05 in response to the Cloudflare outages, the good news is if you're, you know,
+
+00:11:10 I saw some articles that were like, well, you shouldn't be dependent on Cloudflare.
+
+00:11:13 And I saw the counter articles were basically, you know what, when half the internet's down,
+
+00:11:17 no one's hassling you that your app is down because half the internet's down.
+
+00:11:21 So there is an excuse when it isn't your fault.
+
+00:11:25 So yeah, anyways.
+
+00:11:26 That is true.
+
+00:11:27 And you don't see what Cloudflare saved people.
+
+00:11:31 Yes.
+
+00:11:31 Right?
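The "nines" being thrown around here translate to very concrete downtime budgets. As a quick back-of-the-envelope sketch (this is standard availability arithmetic, not something computed in the episode):

```python
# Allowed downtime per year for a given uptime percentage.
# 90% uptime leaves about 36.5 days of downtime a year; each extra
# "nine" shrinks the budget by roughly a factor of ten.
MINUTES_PER_YEAR = 365 * 24 * 60  # 525,600


def downtime_minutes_per_year(uptime_percent: float) -> float:
    """Minutes of downtime per (non-leap) year at the given uptime."""
    return MINUTES_PER_YEAR * (1 - uptime_percent / 100)


for uptime in (90.0, 95.0, 99.0, 99.9, 99.99, 99.999):
    minutes = downtime_minutes_per_year(uptime)
    print(f"{uptime}% uptime -> {minutes:,.1f} minutes/year of downtime")
```

Four nines works out to under an hour of downtime per year, which is exactly why each additional nine costs so much more than the last.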
+
+00:11:32 I'm not using Cloudflare.
+
+00:11:34 I actually use bunny.net.
+
+00:11:35 But CDNs make it possible for your app to survive these spikes in ways
+
+00:11:40 that it very well may not without, and certainly the DDoS type of stuff that they protect
+
+00:11:45 against. Well, and I use it simply for certificates. Like, Google decided everyone shall be HTTPS, even
+
+00:11:52 my sites that don't need it. And rather than try to figure out automation for Let's Encrypt (it has
+
+00:11:58 gotten a lot better, but when I first started it, it was like, and I need this, and I need this, and I
+
+00:12:03 need this, and then the cron job could go down), it's like, or I can stick Cloudflare in front of it
+
+00:12:07 and I never have to think about it ever again, right?
+
+00:12:09 So yeah, there's a little bit of value that way.
+
+00:12:12 Yeah, there definitely, definitely is.
+
+00:12:14 Another thing I want to kind of bring back a little bit
+
+00:12:16 is how you opened this segment.
+
+00:12:19 You said, like, I shared the human side of the story in kind of a humble way.
+
+00:12:24 Like that was certainly something, that was one of the main goals, like I said.
+
+00:12:27 I think it's just a continuation of the podcast, right?
+
+00:12:30 I started the podcast 10 years ago and I'm like, when I got into Python, there were no Python podcasts.
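For what it's worth, the Let's Encrypt automation being described really has gotten dramatically simpler since then. A typical modern setup (an assumption about tooling, not what either host actually runs) is just certbot plus a scheduled renewal; `example.com` is a placeholder:

```shell
# One-time issuance; the nginx plugin answers the ACME challenge
# and installs the certificate into the server config for you:
sudo certbot --nginx -d example.com

# Renewal is idempotent and only touches certificates nearing expiry,
# so a crontab entry like this keeps everything current:
#   17 3,15 * * * certbot renew --quiet
```

The single still-fallible piece is the cron job itself, which is the worry voiced above; a fronting CDN sidesteps the whole question by terminating TLS for you.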
+
+00:12:36 There had been, but there were none at the time. And I'm like, there's all these cool libraries. I want
+
+00:12:40 to hear the stories and the humanity. And you go to the documentation and you're like, cool
+
+00:12:45 technology, sterile as can be. And the reason technology is interesting is not just because
+
+00:12:52 there's an API I can call, but it's like, it's a journey and it's a story. And so I just
+
+00:12:58 wanted to do that again in the book. It's all problem solving, right? And there has to
+
+00:13:02 have been a problem for someone to want to solve it, which means there's going to have been people
+
+00:13:06 involved in trying to figure out what that is. Yeah. I think a lot of people maybe, I don't know,
+
+00:13:10 I shouldn't speak for people. It seems to me though, like a lot of people, they look at a
+
+00:13:15 technology and they just assess it as a dry, sterile sort of thing on its own. That was
+
+00:13:22 created in a context, right? Why was Celery created? Not just so I can send events to it and,
+
+00:13:30 like, you know, add more complexity and asynchronous stuff. It solved a real problem.
+
+00:13:35 And if you hear and you understand and you follow that journey, you're like,
+
+00:13:39 I see, this is where this came from and why it exists.
+
+00:13:42 Then you can decide, is it for me?
+
+00:13:45 Right.
+
+00:13:45 Well, and I think doubly so in the open source space, because like this is
+
+00:13:51 all volunteer work.
+
+00:13:52 And so knowing a little bit about who's doing what and, you know, humanizing that a little
+
+00:13:58 bit.
+
+00:13:59 Right.
+
+00:13:59 Their motivation.
+
+00:14:01 Yeah.
+
+00:14:01 And it's also easier to be grateful, right?
+
+00:14:03 Like this isn't some soulless corporate machine.
+
+00:14:05 There was a reason behind this and a driver behind it.
+
+00:14:09 This portion of Talk Python To Me is brought to you by Sentry.
+
+00:14:13 Let me ask you a question.
+
+00:14:15 What if you could see into the future?
+
+00:14:17 We're talking about Sentry, of course.
+
+00:14:19 So that means seeing potential errors, crashes, and bugs before they happen.
+
+00:14:24 Before you even accept them into your code base.
+
+00:14:26 That's what Sentry's AI Seer Code Review offers.
+
+00:14:30 You get error prediction based on real production history.
+
+00:14:34 AI Seer Code Review flags the most impactful errors your PR is likely to introduce before merge using your app's error and performance context, not just generic LLM pattern matching.
+
+00:14:46 Seer will then jump in on new PRs with feedback and warnings if it finds any potential issues.
+
+00:14:52 Here's a real example.
+
+00:14:53 On a new PR related to a search feature in a web app, we see a comment from Seer by Sentry bot in the PR.
+
+00:15:02 And it says, potential bug: the process search results function can enter an infinite recursion when a search query finds no matches,
+
+00:15:10 as the recursive call lacks a return statement and a proper termination condition.
+
+00:15:15 And Seer AI Code Review also provides additional details, which you can expand for further information on the issue and suggested fixes.
+
+00:15:23 And bam, just like that, Seer AI Code Review has stopped a bug in its tracks without any
+
+00:15:29 devs in the loop.
+
+00:15:30 A nasty infinite recursion bug never made it into production.
+
+00:15:33 Here's how you set it up.
+
+00:15:34 You enable the GitHub Sentry integration on your Sentry account, enable Seer AI on your
+
+00:15:40 Sentry account, and on GitHub, you install the Seer by Sentry app and connect it to the
+
+00:15:45 repositories that you want it to validate.
+
+00:15:47 So jump over to Sentry and set up Code Review for yourself.
+
+00:15:50 Just visit talkpython.fm/seer-code-review.
+
+00:15:54 The link is in your podcast player show notes and on the episode page.
+
+00:15:58 Thank you to Sentry for supporting Talk Python and me.
+
+00:16:02 Inside the book, you've added a couple of things that are a little sort of non-standard,
+
+00:16:08 like the audio reader briefs and the galleries.
+
+00:16:11 You want to give a quick rundown?
+
+00:16:13 And speaking of motivation, what motivated you to include those things?
+
+00:16:18 - So let me describe what they are first, 'cause they are weird,
+
+00:16:21 but they're weird in a good way, I think.
+
+00:16:23 So if you go to the book, my vision was somebody's gonna be reading this,
+
+00:16:29 very likely on a Kindle, and if I go and put really nice diagrams, pictures, whatever,
+
+00:16:36 how good is that gonna look on a Kindle Paperwhite, black and white, you know what I mean?
+
+00:16:40 Like how hard is that going to be to read?
+
+00:16:42 I think it's gonna be hard is what I decided.
+
+00:16:44 And so what I ended up doing is I said, okay, how can I make it better for people so that when they want to work with code, it's not trapped inside your Kindle or your iPad, Apple Books or wherever you read it, but it's super accessible, right?
+
+00:17:00 So what I did is I created some things I called galleries, and there's a code gallery, a figure gallery, and a links gallery.
+
+00:17:07 And these are just like, they're kind of like an index of those things.
+
+00:17:11 So like the links one just says, hey, here's all the URLs that we talked about in chapter 10 or chapter 11,
+
+00:17:17 and just the sentence that contains them.
+
+00:17:19 So instead of trying to go back through and flipping through the book,
+
+00:17:22 like, where was that thing they talked about, right?
+
+00:17:24 Like, no, you just go to the gallery and you click on the chapter
+
+00:17:27 or you just do command F.
+
+00:17:29 There it is, you know what I mean?
+
+00:17:30 And also for, especially for the figures, like it has these 2,000 or 4,000
+
+00:17:37 by whatever pixel pictures that you're not even allowed to put into
+
+00:17:42 like a Kindle book.
+
+00:17:42 They're like, no, we're going to redo those, rescale those images for you
+
+00:17:46 down to something fuzzy, right?
+
+00:17:47 So if you want to read like little tiny text, I put it there.
+
+00:17:51 So that's the galleries.
+
+00:17:51 And maybe a little bit more backstory here: when I wrote this,
+
+00:17:56 I've worked with other types of editing tools.
+
+00:17:59 I'm just like, I need to write this and I need to get this done in a super fluid
+
+00:18:04 way.
+
+00:18:05 So I'm just going to write in Markdown.
+
+00:18:07 Right.
+
+00:18:07 Just writing in Markdown.
+
+00:18:09 And so what I did is, of course there's book publishing things that you can put
+
+00:18:13 Markdown into and so on.
+
+00:18:14 But I'm like, I'm just going to write one Markdown file per chapter and then write some Python program to assemble that in interesting ways.
+
+00:18:23 Right.
+
+00:18:23 And then turn that into an EPUB or PDF or Kindle or whatever through something called Pandoc.
+
+00:18:29 Are you familiar with Pandoc?
+
+00:18:30 I've heard of it.
+
+00:18:31 Yeah.
+
+00:18:31 If you go and look, for people who don't know what Pandoc is, if you go look at Pandoc, it has right on the web page, it has this like fuzzy thing on the right.
+
+00:18:40 It's like gray fuzzy.
+
+00:18:41 You know, what is that?
+
+00:18:43 This thing on the right shows you all the formats that go in and all the formats that could come out,
+
+00:18:49 and it's insane. Like, you can't even... the lines connecting the graphs of these
+
+00:18:55 things, it's just a black blob. Like, I could put in a Haddock document, whatever that is,
+
+00:19:02 and convert that to DocBook 4, right? I mean, it's insane. Okay, so what I did is I built this
+
+00:19:07 simple Python thing that reassembles Markdown in certain ways and then feeds the
+
+00:19:13 final thing that it built into Pandoc to get the different ebook formats.
+
+00:19:16 Right, right.
+
+00:19:17 Okay.
+
+00:19:17 But then it occurred to me, like, so I didn't start out with these gallery type things or
+
+00:19:22 other stuff, but I'm like, well, this is just Python against Markdown.
+
+00:19:27 Surely I can start pulling out all the links and pulling out the images and then writing
+
+00:19:30 them somewhere else and then just committing that to GitHub.
+
+00:19:33 So once, you know, it's kind of just the standard story of Python or programming in general,
+
+00:19:38 but I think it's extra common in Python.
+
+00:19:40 It's like, I started solving the problem with Python.
+
+00:19:42 And once that was in place, it's like the next block and the next thing is just like,
+
+00:19:47 that's easy now.
+
+00:19:47 And that's easy.
+
+00:19:48 And these three things are also easy.
+
+00:19:50 Let's just do that and just keep adding to it.
+
+00:19:52 So that's where they came from: one, wanting to make sure people had a good experience
+
+00:19:57 with like code resources, pictures, and so on.
+
+00:20:00 But also it's just kind of following the lead of like, hey, let's just keep going.
+
+00:20:04 Well, and it's one of the beauties of an ebook.
+
+00:20:08 With dead-tree copies, those things cost money.
+
+00:20:11 And so it's like, oh, I've got a great idea for six more appendices.
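The assemble-with-Python, convert-with-Pandoc pipeline being described could be sketched roughly like this. To be clear, this is an illustration, not Michael's actual build script; the file layout and the `build_book` helper are invented for the example, and only the ideas (one Markdown file per chapter, a links gallery extracted with a regex, Pandoc for the final conversion) come from the conversation:

```python
# Sketch: concatenate per-chapter Markdown, extract links into a
# "gallery" chapter, then hand the assembled file to Pandoc, which
# infers the output format from the file extension.
import re
import subprocess
from pathlib import Path

# Matches standard Markdown links: [title](https://example.com)
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)]+)\)")


def build_book(chapter_dir: str, output: str = "book.epub") -> None:
    chapters = sorted(Path(chapter_dir).glob("*.md"))
    combined, links = [], []
    for chapter in chapters:
        text = chapter.read_text()
        combined.append(text)
        links += [f"- [{title}]({url})" for title, url in LINK_RE.findall(text)]
    # The links gallery is just another generated chapter at the end.
    combined.append("# Links Gallery\n\n" + "\n".join(links))
    assembled = Path("_assembled.md")
    assembled.write_text("\n\n".join(combined))
    subprocess.run(["pandoc", str(assembled), "-o", output], check=True)
```

Swap the output filename for `book.pdf` or `book.html` and the same assembled Markdown yields a different format, which is exactly the appeal of Pandoc as the last step.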
+
+00:20:15 And that's when you start going, oh, wait a second.
+
+00:20:18 I'm not going to add 300 pages to a 200 page book.
+
+00:20:21 Yeah, exactly.
+
+00:20:22 With an ebook, you can go, oh, yeah, here, we can make this referenceable in a couple
+
+00:20:28 different ways.
+
+00:20:29 Right.
+
+00:20:29 Yeah.
+
+00:20:29 It's like it duplicates the images into, you know, maybe 20 more pages or something, but it's an ebook.
+
+00:20:35 Who cares?
+
+00:20:36 Exactly.
+
+00:20:36 Yeah.
+
+00:20:36 Yeah.
+
+00:20:37 So, you know, over the history of the show, I think I've become familiar with your AI journey.
+
+00:20:45 And recently, it sounds like you've bought in.
+
+00:20:48 You're a fairly big proponent.
+
+00:20:51 That being said, there's still a Made by Humans logo inside.
+
+00:20:55 Yeah.
+
+00:20:57 So I'm going to put you on the spot.
+
+00:21:01 Do you believe in it or not?
+
+00:21:04 Do I believe in it or not?
+
+00:21:06 So why made by humans?
+
+00:21:09 Yeah, it's a really good question.
+
+00:21:11 So I think there's a wariness of content generated by AI or assisted by AI meant to attract attention and build authoritative feelings about a thing
+
+00:21:26 when that authority or that skill set is not necessarily yours.
+
+00:21:31 And that I still very much do not like.
+
+00:21:34 If I wanted to create a blog, I mean, I guess I could do it.
+
+00:21:37 It'd be a fun experiment, I guess, in sort of an evil way.
+
+00:21:40 Like, what if I just go create a completely random blog
+
+00:21:43 and I just have chat just bust out an article twice a day, every day,
+
+00:21:48 on the thing that's at the top of Hacker News or something?
+
+00:21:50 You know, just like, you could do that.
+
+00:21:52 And actually, it might even be interesting.
+
+00:21:54 I don't really even know.
+
+00:21:56 But I don't want it. I don't want that. Right. And for this, I wanted to share my story,
+
+00:22:01 not have AI create a book that has bullet points of my story.
Right. Right. Yeah. So for me,
+
+00:22:07 it was important to, like, write this. I wrote it a hundred percent by Michael, right? It took me
+
+00:22:13 a lot of work. And people... I know it got posted on Reddit and I think Hacker News somewhere.
+
+00:22:19 There were a bunch of comments, and they're like, oh, this thing is definitely just AI generated. It
+
+00:22:25 felt not AI generated to me. If it makes you feel any better, I've actually had comments pop up on
+
+00:22:32 some of my video courses claiming that my voiceover was AI. So that's just the world we live in now.
+
+00:22:39 It's the world we live in and there's not a lot you can do with it. So just to kind of put a little
+
+00:22:43 bit of a pushback against that, I did put like a prefix sort of thing and a label that says
+
+00:22:50 made by humans. And you know what's really funny is I don't know if I can actually find that section.
+
+00:22:55 I don't think I can on just the web part, but I made a picture.
+
+00:22:59 Maybe I did.
+
+00:23:00 Humans?
+
+00:23:01 No.
+
+00:23:01 Anyway, I made a picture that I drew.
+
+00:23:03 I literally, I'm a big fan of Pixelmator Pro.
+
+00:23:07 I went into Pixelmator Pro and I drew it.
+
+00:23:09 And they said, proof that this is AI generated.
+
+00:23:12 Look at that stupid made by humans graphic.
+
+00:23:14 It's clearly AI generated.
+
+00:23:15 It would be way better if it wasn't generative.
+
+00:23:17 Yes.
+
+00:23:19 Okay.
+
+00:23:20 So how do I square that with me actually being quite a fan of AI stuff these days?
+
+00:23:24 Let's do like a looking back and then looking forward.
+
+00:23:28 So let's go back 30 years.
+
+00:23:30 I'm also a fan of using editors that, when I hit dot, tell me what I can do with that function, class, variable, et cetera.
+
+00:23:38 So I'm not constantly in the documentation, right?
+
+00:23:42 Does that make me not a programmer?
+
+00:23:44 I don't think so.
+
+00:23:45 I'm still writing code.
+ +00:23:46 I'm still thinking architecture. + +00:23:47 I'm just not in the documentation constantly. + +00:23:50 And honestly, I maybe don't memorize every little function and field of everything in the standard library, right? + +00:23:57 It's fine. + +00:23:57 That's not where our time is best spent. + +00:23:59 And I feel that way about AI programming. + +00:24:01 I think there's a lot of, there are pitfalls and there are worrisome aspects of it. + +00:24:06 But you can use some of these agentic AI tools these days to think in so much higher level building blocks. + +00:24:13 Think of like, I'm working in a function and I'm writing this part. + +00:24:16 or I'm working in design patterns. + +00:24:19 And I can think of these big sort of concepts. + +00:24:22 Well, with this AI stuff, you can just say, what if we had a login page? + +00:24:25 Oh, we have a login page. + +00:24:27 Now, what other building block do I need? + +00:24:29 Like the building blocks are so big and sort of non-critical software, + +00:24:35 non-super important software becomes so much cheaper than before. + +00:24:40 You're like, I wish I had a little utility or app that would do this, + +00:24:43 but it just definitely doesn't justify two weeks of work to have it. Like, what if it was a couple of prompts and half an hour? Like, + +00:24:50 yeah, well then I'll have it. You know what I mean? And you can, that is transformative + +00:24:55 for how we work. Yeah. So much of coding is boilerplate, right? So if we can figure out how + +00:24:59 to make that easier, then why not? Right. And I haven't got there with it myself. I don't know + +00:25:05 whether I will. I'm definitely a little more suspicious of it than you are, but I copy and + +00:25:11 paste code all the time. + +00:25:12 And it's not like I'm like, oh, I have to hand tune that. + +00:25:16 No, it's like, well, I got to copy something that does that. 
+ +00:25:18 - Yeah, let me give you a concrete example because I think it's easy to talk in generalizations + +00:25:24 and people are like, well, that's not for me, a bunch of AI slop, which is fair. + +00:25:28 But I'll give you an example of one thing I'm like, this was just such a nuisance and I'm gonna fix it. + +00:25:33 So when I first built Talk Python, the website 10 years ago, + +00:25:37 Bootstrap was all the rage, not modern Bootstrap, like old Bootstrap, right? + +00:25:41 Which they've completely redone the way that you have to structure your HTML and CSS + +00:25:48 completely, incompatibly, several times since then. + +00:25:51 And until very recently, until this summer, every time I wanted to add a feature or an aspect, + +00:25:56 like for example, this whole section that hosts the book, + +00:25:59 I wanna add that, well, you know what I had to do? + +00:26:01 I had to go write like 10-year-old Bootstrap. + +00:26:03 And I'm like, I hate it so much. + +00:26:04 There's so much nicer tools I could be doing this. + +00:26:07 but there's 8,000 lines of HTML and almost as many CSS. + +00:26:12 Does it justify me rewriting that much UI so that I can feel a little bit better + +00:26:18 and use a little bit nicer UI tooling? + +00:26:20 No, it definitely doesn't. + +00:26:22 And so for a couple of years, I'm like, oh, I wish I could do something else, + +00:26:27 but it's not worth it. + +00:26:28 And then I was sitting on my porch, little back area with my computer in the summer, + +00:26:31 hanging out, I'm like, you know what? + +00:26:32 I bet Claude could do it. + +00:26:34 Hey, Claude, rewrite all of this site. + +00:26:37 make a plan and rewrite it, move it from Bootstrap 3 to Bulma, which is like simple tailwind, + +00:26:43 and just completely do it. Four hours later, the whole thing was redone, like 20,000 lines of edits. + +00:26:49 Wow. + +00:26:49 Done. And it wasn't perfect. I had to go, you messed up a little bit here. 
And actually,
+
+00:26:53 that was right, but that doesn't look good anymore. So could you actually just make it look...
+
+00:26:57 you know what I mean? But I mean, it was like half a day. That work was done.
+
+00:27:00 Right.
+
+00:27:00 And that is a different level.
+
+00:27:04 No, it would have been weeks to a month.
+
+00:27:07 And it's the worst kind.
+
+00:27:08 It's like, okay, here's how the grid system used to work.
+
+00:27:11 Let me restructure the HTML.
+
+00:27:12 Oh, you lost a div?
+
+00:27:14 Whoopsie.
+
+00:27:15 Now how are you going to untangle this?
+
+00:27:16 You know what I mean?
+
+00:27:16 Like really, really not good stuff.
+
+00:27:19 And you can just turn these tools on it.
+
+00:27:20 And I'm like, you know, love it or hate it.
+
+00:27:22 That is a skill and a power and a tool that is unlike things we've had before.
+
+00:27:27 And so when I started having some of those kinds of experiences, I'm like, all right,
+
+00:27:31 I need to pay attention to this.
+
+00:27:32 I honestly think a lot of these AIs and LLMs, they're kind of copyright theft and other types of things.
+
+00:27:39 And there's the environmental aspect and all that.
+
+00:27:41 But the thing is out of the box.
+
+00:27:44 Pandora is on the loose, right?
+
+00:27:46 So given that, putting your head in the sand is not going to make it go away.
+
+00:27:51 Should you use it or not?
+
+00:27:52 It's a very powerful tool.
+
+00:27:53 And so that is what I'm excited about. But I'm not excited about when I go to YouTube
+
+00:27:58 and I see a video and you can just tell that it's a voiceover plus some general,
+
+00:28:02 or I go to read a blog and you can tell that, like,
+
+00:28:05 they didn't even put enough energy into it; they spent less time writing it than I have to spend reading it.
+
+00:28:10 That's not right.
+
+00:28:10 There's something going wrong here.
+
+00:28:12 - Well, as with all tools, we'll figure out what works and what doesn't work.
+ +00:28:16 Those 8,000 files that you're talking about though, + +00:28:19 those are 8,000 files you have and built over time. + +00:28:25 I suspect- - Lines, by the way, not files. + +00:28:27 - Lines, I'm sorry. + +00:28:28 - I think it's a couple hundred files probably. + +00:28:31 so, but, so that might be something that folks listening aren't really aware of, + +00:28:36 right? + +00:28:37 Like, you're not just the, you know, the podcasts and the courses, but you're + +00:28:42 the guy behind the engineering behind all of it. + +00:28:44 So why, you know, why do that? + +00:28:47 Why not, you know, Squarespace or something along those lines didn't exist when you came + +00:28:51 out, but you get the idea. + +00:28:52 Like what, what, what, why spin it up yourself? + +00:28:55 How did you, how did you get there? + +00:28:56 So it's a good question. + +00:28:58 When I started, there were places I could have gone and hosted the podcast. + +00:29:05 You know, they were very unappealing. + +00:29:08 Not in the sense, like, as a consumer of it, like, they put your show there. + +00:29:13 They were really ugly. + +00:29:15 And they would do things like, next to your show, here's some other shows you might like. + +00:29:18 You're like, no. + +00:29:20 No, I don't. + +00:29:21 I just got people to come to my page. + +00:29:23 You're sending them away. + +00:29:24 Like, don't do that. + +00:29:25 Right. + +00:29:25 But those sites are like Squarespace or whatever, and they're hosting a bunch of them. + +00:29:29 And so they want engagement on their platform broadly. + +00:29:33 They're not for you. + +00:29:34 So initially I thought, well, plus I don't have a ton of experience writing Python code. + +00:29:38 And if I'm going to do the podcast, the more I can get my hands on this stuff, get experience. + +00:29:43 So I just sat down and really in like three days, I wrote the Talk Python website. + +00:29:48 I'm like, I'm doing it this weekend. 
+
+00:29:50 You know what I mean?
+
+00:29:50 I had a job at the time.
+
+00:29:51 So I'm like, I got to do it.
+
+00:29:52 It's a long weekend.
+
+00:29:53 We're doing it.
+
+00:29:53 And so I just sat there and cranked it out and really got a good appreciation for building all the way to the end, you know, like not 60% or following a demo, but like, no, here's a fully working app and all the little nuances.
+
+00:30:08 But then honestly, that's like the genesis of the story that is the book: well, now how do I get it out there?
+
+00:30:15 I built it.
+
+00:30:16 It's fine.
+
+00:30:17 It works great here locally.
+
+00:30:19 Now, like, where do I take it?
+
+00:30:21 Right.
+
+00:30:21 And a lot of places said, well, you just fire up your Linux terminal and your SSH.
+
+00:30:24 And I'm like, these words are making me very nervous.
+
+00:30:27 I need them to not do that.
+
+00:30:28 I need you to stop saying that.
+
+00:30:31 Don't forget to swing the chicken over your head.
+
+00:30:35 Exactly.
+
+00:30:37 So I actually started on PythonAnywhere, even before Anaconda owned them. The
+
+00:30:45 selling point was you go to your browser.
+
+00:30:47 I think you give it a git URL, or maybe you go into a terminal and do a git pull.
+
+00:30:53 I can't remember how it worked 10 years ago, but it was basically: you go to the webpage,
+
+00:30:57 you type in your domain, you get a terminal, which is basically an SSH connection, but
+
+00:31:02 in the browser, and then you give it some commands and then they manage it for you.
+
+00:31:05 And I'm like, okay, I don't really have to know any Linux.
+
+00:31:08 I just have to do the two things it says in the terminal.
+
+00:31:10 And then they keep it running and they do the SSH keys, the SSL certificates, the DNS,
+
+00:31:17 and I'm like, this I can do. And I got it going there, and I was really proud. And I ran
+
+00:31:21 Talk Python on basically PythonAnywhere and SQLite for like six months, the first six months.
+
+00:31:27 But then it occurred to me that PythonAnywhere is not really intended to host production-level
+
+00:31:33 applications. And it occurred to me when I got an email from them one day, again, this is pre-Anaconda,
+
+00:31:39 and what it said was, we're going to be down for four hours as we do some maintenance. That's not
+
+00:31:46 going to be the best look for my podcast, which is just now starting to gain some traction and getting
+
+00:31:50 a lot of people talking on social media and saying, hey, there's a new podcast, you should check
+
+00:31:54 it out. I'm like, the four hours are not making me psyched. Like, I understand that things might have
+
+00:31:59 to reboot, they might be down for 30 seconds, but hours and hours seems a little like, this is not
+
+00:32:06 really what they intend this for. It's like for hobbyists to put up a thing, and I probably don't
+
+00:32:10 belong there anymore. And once you shifted, why not AWS or Azure or something like that?
+
+00:32:16 So I looked around and I went to DigitalOcean.
+
+00:32:19 So I'd done stuff with both Azure and AWS.
+
+00:32:23 A little part that I left out about this web journey is I had actually run some pretty big websites
+
+00:32:30 in both AWS and Azure, but they were Windows-based .NET type things, right?
+
+00:32:36 So literally GUI sort of configuration hosting them, or platform as a service.
+
+00:32:42 And I don't know, I looked at both of them, especially Azure at the time,
+
+00:32:46 I'm like, "Well, this is complicated," and like unnecessarily so,
+
+00:32:49 and I'm afraid I'm trading one level of complexity for another, and also really expensive,
+
+00:32:55 like no joke expensive.
+
+00:32:57 The podcast, like in terms of people viewing the pages,
+
+00:33:00 is nothing insane. I mean, it's certainly popular, but it's nothing like, how are you gonna handle that?
+
+00:33:05 But the amount of traffic the podcasts and courses do
+
+00:33:07 in terms of video and MP3s and even XML... like I think Talk Python ships about a gigabyte of,
+
+00:33:15 no, a terabyte of XML.
+
+00:33:17 Think about a terabyte of XML every month.
+
+00:33:18 Like it's basically a distributed denial of service, a DDoS, and a welcome DDoS attack,
+
+00:33:25 because you think how many podcast players are out there
+
+00:33:27 going, got a new one, got a new one, got a new one, got a new one.
+
+00:33:30 And each one of those requests is like a meg of XML or more.
+
+00:33:35 You know, it's like, got a new one, got a new one.
+
+00:33:42 Certainly with the courses, my bill was well over a thousand bucks in just bandwidth, right? And then
+
+00:33:47 I looked at DigitalOcean and I'm like, oh, you mean all that bandwidth is free? It's included. You
+
+00:33:54 get like terabytes of free traffic with DigitalOcean or Hetzner or some of these smaller ones. And I'm
+
+00:34:01 just like, yeah, this is better. Like, I don't know what this stuff is for here,
+
+00:34:05 where they charge, you know, a hundred dollars a terabyte just to ship stuff around. It's crazy.
+
+00:34:10 I have found, particularly professionally, it's like, oh, you're going to charge me for how many DNS lookups there are.
+
+00:34:16 That doesn't seem like something I can predict.
+
+00:34:20 Yes, I know.
+
+00:34:22 And quite frankly, if it reaches a certain level, I need you to turn it off because something's gone horribly awry.
+
+00:34:31 This portion of Talk Python To Me is brought to you by Agency.
+
+00:34:34 The Agency, spelled A-G-N-T-C-Y, is an open source collective building the Internet of Agents.
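Those feed numbers pencil out in a striking way. A rough calculation (using the round figures from the conversation: a terabyte of RSS per month, about a megabyte per feed request, and the roughly $100-per-terabyte egress pricing mentioned; the exact rates are assumptions):

```python
# Back-of-the-envelope feed-traffic math using the episode's round numbers.
TB = 1024 ** 4
MB = 1024 ** 2

monthly_feed_bytes = 1 * TB                    # ~1 TB of RSS XML per month
requests_per_month = monthly_feed_bytes / MB   # at ~1 MB per feed fetch
egress_cost = monthly_feed_bytes / TB * 100    # at ~$100 per TB of egress

print(f"~{requests_per_month:,.0f} feed fetches/month, "
      f"~${egress_cost:.0f}/month in egress")
```

That is on the order of a million feed fetches a month, and roughly $100 of metered egress for the RSS feed alone before any MP3 or video bytes move, which is the gap between the metered clouds and the included-bandwidth hosts being compared here.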
+
+00:34:40 We're all very familiar with AI and LLMs these days, but if you have not yet experienced the
+
+00:34:46 massive leap that agentic AI brings, you're in for a treat. Agentic AI takes LLMs from the world's
+
+00:34:53 smartest search engines to truly collaborative software. That's where Agency comes in. Agency is
+
+00:34:59 a collaboration layer where AI agents can discover, connect, and work across frameworks.
+
+00:35:05 For developers, this means standardized agent discovery tools, seamless protocols for interagent
+
+00:35:11 communication, and modular components to compose and scale multi-agent workflows.
+
+00:35:16 Agency allows AI agents to discover each other and work together regardless of how they're
+
+00:35:21 built, who built them, or where they run.
+
+00:35:24 And they just announced several key updates, including interoperability with Anthropic's
+
+00:35:29 Model Context Protocol, MCP, a new observability data schema enriched with concepts specific to
+
+00:35:35 multi-agent systems, as well as new extensions to the Open Agentic Schema Framework, OASF.
+
+00:35:43 So are you ready to build the internet of agents? Get started with Agency and join
+
+00:35:47 CrewAI, LangChain, LlamaIndex, Browserbase, Cisco, and dozens more. Visit talkpython.fm
+
+00:35:54 slash agency to get started today.
+
+00:35:56 That's talkpython.fm/agency.
+
+00:35:58 The link is in your podcast player's show notes and on the episode page.
+
+00:36:02 Thank you to the Agency for supporting Talk Python and me.
+
+00:36:06 There's certainly areas where the cloud can go, like, sideways.
+
+00:36:10 In the book, I mentioned a story about Kara, I believe.
+
+00:36:13 And that was this project that this woman, I think in Hong Kong or South Korea,
+
+00:36:19 I'm afraid I can't remember which, I think it's South Korea.
+
+00:36:22 Anyway, created this.
+ 

+00:36:23 She's a photographer and really hates AI-generated art, so she created this service that would say, hey, 

+00:36:29 give me a piece of art and I'll tell you if it's AI-generated or not, or something vaguely like this. 

+00:36:35 And her thing took off in the App Store and was like number six, and her cloud bill at Vercel was 

+00:36:41 ninety-six thousand dollars in a week. Right? Not as a business, just as a human who built something 

+00:36:47 fun as a side project. Like, oh my. Yeah. And in fairness to a lot of those tools, they tend to 

+00:36:52 have ways of saying, please limit this and do that. 

+00:36:55 Yeah. 

+00:36:55 They're not the default. 

+00:36:57 And if you're not thinking about that problem and protecting yourself, you know, you'd 

+00:37:03 kind of hope that it would be the other way around. 

+00:37:06 It should be, yeah, you know, here's your cap. 

+00:37:08 And if you want more than that, you need to do something about it. 

+00:37:11 Yeah, absolutely. 

+00:37:12 And in fairness to them, they did send her a message saying your bill is going way higher 

+00:37:17 than you might expect. 

+00:37:18 And she didn't look at her email or something, but still. 

+00:37:22 And so one of the things that really appeals to me is when you choose something like a 

+00:37:27 Hetzner or a DigitalOcean or something, and you say, I'm going to pay for the server. 

+00:37:31 You're like, okay, that's $40 I'm committing to. 

+00:37:34 Maybe double that, you know, whatever. 

+00:37:37 But the bandwidth is basically free or it's included, right? 

+00:37:40 But for 40 bucks, it feels free. 

+00:37:42 And then it's only going to cost as much as it costs. 

+00:37:44 You might have to go, oh my gosh, it's too much traffic. 

+00:37:47 We're going to have to deal with it. 

+00:37:48 But the upper bound of those systems these days is so high. 
+ 

+00:37:53 It is so high that we, you know, back to that aspirational thing that you mentioned, right? 

+00:38:00 Like the chances that you blow past what a $50 server can handle? 

+00:38:05 You're going to be really, really popular with that SaaS or something. 

+00:38:10 You were joking earlier about SQLite. 

+00:38:13 It's gotten so much better as well. 

+00:38:14 And I'm not saying it's the answer to everything, 

+00:38:17 but it could probably come pretty close to running your site now too. 

+00:38:21 Like it's scary between the processor improvements and the improvements in the software. 

+00:38:27 It's made a big difference for that kind of stuff. 

+00:38:30 It takes very, very little hardware to handle something that's pretty, pretty impressive. 

+00:38:36 Yeah. 

+00:38:38 So I think the title of chapter four is one of my favorites. 

+00:38:41 It gives a little hint as to what approach you took. 

+00:38:45 The title is Docker, Docker, Docker. 

+00:38:48 So what approach did you take? You know what, I was really not wanting to do Docker. Genuinely, I mean 

+00:38:53 that. And so what I did when I originally switched over to some VMs, I'm like, okay, the story I'm told 

+00:38:59 of what the cloud is, you know, I bought the AWS, the EC2 story. Well, we've got all this extra capacity. 

+00:39:05 Oh, instead of getting like really expensive heavy metal, you know, big metal sort of servers, you get 

+00:39:11 a bunch of small ones and they're kind of like cheap and you just make them, throw them away, 

+00:39:15 whatever, right? So I went and made a bunch of small servers in DigitalOcean. I 

+00:39:20 think I had eight servers at one point, and I thought this is gonna give me lots 

+00:39:22 of isolation. If I got to work on this one thing, it won't mess with that. And 

+00:39:25 what I realized is they're interconnected enough that really I 

+00:39:30 end up just having to reboot eight servers in an orchestrated way rather than 

+00:39:34 managing one. I'm like, this 
is just worse. I gotta patch eight servers instead of one 

+00:39:38 now, because this is not better. So how do I end up with Docker, Docker, Docker? I 

+00:39:42 realized that it would be better to just have one server, and, basically stepping back just a little bit. 

+00:39:48 Like, what if you could completely isolate yourself from almost all the complexity of the cloud and all of their services and all that junk and just say, I have a place where I can run apps that's got enough capacity that I can just keep throwing more apps in there if I want. 

+00:40:04 And it doesn't have any tie-in with the specific APIs and services of a particular provider. 

+00:40:10 So I said, well, what if I just get a big server and I just run all my apps in there? 

+00:40:15 And if I want a database, I put the database there. 

+00:40:17 If I want like a Valkey cache, I can put a Valkey cache in there, and things like that. 

+00:40:22 And that's sort of as much autonomy as I can exert on running something in the cloud. 

+00:40:27 It's almost like I went and got a big machine and stuck it in my closet. 

+00:40:31 But that's insane, because here you get million-dollar networking equipment and, you know, failover. 

+00:40:36 But that doesn't mean you have to go fully PaaS, managed database, this other service. 

+00:40:41 Like you could just say, just give me a Linux machine where I can then go do my 

+00:40:46 hosting and all my apps and let them party and talk to each other and stuff in there. 

+00:40:52 Right. 

+00:40:52 So then I thought, well, I don't have all these little servers for isolation. 

+00:40:57 I'm not really sure I want to throw all this random stuff together, like completely just 

+00:41:02 in the same soup in that one big server. 

+00:41:05 And by the way, the big server right now that it's running on 

+00:41:07 has eight cores, 16 gigs of RAM, and costs $30. 

+00:41:11 Right. 
+ 

+00:41:11 It comes with two terabytes or four terabytes of traffic, 

+00:41:13 something like that. 

+00:41:14 Lots. 

+00:41:15 $400 of included bandwidth for $30. 

+00:41:18 So I said, well, what if I took that over? 

+00:41:19 I think autonomy is a big motivator of this whole journey as well. 

+00:41:24 Like, I don't want to be tied into all these different things. 

+00:41:27 I just want a space where I have reliable compute and networking 

+00:41:30 and Linux, and I can just do whatever. 

+00:41:32 So then I said, all right, well, I better figure out some of the stuff with Docker just so that there is some 

+00:41:38 isolation of all the different pieces living in the same space. 

+00:41:41 So I forced myself to learn Docker, and what occurred to me was, oh, 

+00:41:45 Docker is just writing down in a file what I would normally have to type in the 

+00:41:50 terminal to make the thing happen. 

+00:41:52 Except I put RUN in front of the command, or COPY instead of cp, and I 

+00:41:58 get repeatability. And someone else has packaged a bunch of this stuff, 

+00:42:02 so you don't have to do it yourself. 

+00:42:04 Exactly. 

+00:42:05 And I'm like, okay, I don't know what all my concern was about 

+00:42:08 because it's not much more complicated. 

+00:42:10 One of my concerns was sort of monitorability. 

+00:42:13 Like if I just go there and I just create a bunch of virtual environments 

+00:42:16 and run my code, I can actually go and see the source code. 

+00:42:19 I can see the config files. 

+00:42:20 I can see where the logs are being written and I can sort of manage it through SSH. 

+00:42:25 And I thought, well, if I put a bunch of disconnected Docker things together, 

+00:42:28 that's going to be challenging. 

+00:42:29 And I realized actually, not really. 
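That "Docker is the shell session written down" idea might look something like this in practice. A minimal sketch, not the actual Talk Python setup; the base image, packages, and start command here are all illustrative:

```dockerfile
# Hypothetical example: each instruction mirrors a command
# you would otherwise type by hand on the server.
FROM python:3.12-slim

# apt-get install ...  becomes  RUN apt-get install ...
RUN apt-get update && apt-get install -y --no-install-recommends git \
    && rm -rf /var/lib/apt/lists/*

WORKDIR /app

# cp requirements.txt ...  becomes  COPY requirements.txt ...
COPY requirements.txt .
RUN pip install --no-cache-dir -r requirements.txt

COPY . .

# The command you would run in the terminal to start the app.
CMD ["gunicorn", "-b", "0.0.0.0:8000", "app:app"]
```

The payoff is exactly the repeatability described above: the same file rebuilds the same environment every time, instead of a one-off SSH session you have to remember.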
+ 

+00:42:32 Like if you set it up the same, you could still tail all the logs, and you can SSH into 

+00:42:38 the containers if you really have to, you know, look at something running inside them. Like, what is the 

+00:42:44 process actually? I don't know. What does it do? How much memory is it using relative to other stuff? 

+00:42:48 And I also talk about a bunch of tools for monitoring them. Yeah. So how has that changed over 

+00:42:53 time? Like you started with some fairly bare bones, you've got some extra tools. 

+00:42:59 What does that evolution look like? 

+00:43:01 Well, I used to rely more on whatever the cloud provider, 

+00:43:07 DigitalOcean or Hetzner, offered. 

+00:43:09 You know, they always have like a metrics section. 

+00:43:11 So I can go see, well, what's the CPU doing? 

+00:43:13 What's the memory looking like over time? 

+00:43:16 And that works okay. 

+00:43:19 And if you've got one app that you're running there, you're like, okay, well, that must be how much memory the app is using. 

+00:43:24 But right now, if I go to Talk Python, I think there are 27 different containers running, 

+00:43:31 so you can't just ask, how's the server doing? 

+00:43:34 It doesn't tell you very much. 

+00:43:35 You know, it really matters much more. 

+00:43:36 Well, it's busy. 

+00:43:37 I get it. 

+00:43:38 But which one is the problem? 

+00:43:39 Which one is busy? 

+00:43:40 Which one's using all of it? 

+00:43:41 So I started to look around, and there's actually a bunch of recommendations 

+00:43:45 that I have in the book. 

+00:43:47 So one of them, the first one I used, was this thing called Glances, 

+00:43:51 and Glances is excellent. 

+00:43:52 And by the way, Glances, the way they talk about often getting it, 

+00:43:56 I think, where do they talk about installing it here? 

+00:43:59 Probably is often like apt install glances or something like that, right? 
+ 

+00:44:04 But a lot of these tools even have Docker versions. 

+00:44:07 If you share the volumes and sockets just right, they function just the same. 

+00:44:11 So you could say docker run glances XYZ and it doesn't even install, 

+00:44:16 it doesn't even touch your one big server that is kind of like your building space. 

+00:44:20 So it leaves it a little more pure, right? 

+00:44:22 So Glances is super cool. 

+00:44:23 And what it does is it gives you this really nice dashboard of what's going on with your app, like your server. 

+00:44:33 How much memory is being used? 

+00:44:34 How much CPU is being used? 

+00:44:36 How has that been over time? 

+00:44:37 Has there been like extended spikes and so on? 

+00:44:40 And one of the things that's new to Glances, and I don't think it's in this picture that's on their home page. 

+00:44:46 I'm pretty sure it is not. 

+00:44:48 Oh, no, it is. 

+00:44:48 It's just, mine is inverted because I have so many. 

+00:44:51 It has a container section. 

+00:44:52 So when you run it, it actually shows you not just the processes, but also gives you performance, memory, IO, etc. 

+00:45:00 for all the running containers and their uptimes and those kinds of things. 

+00:45:04 So this is super cool. 

+00:45:05 So you construct a certain Docker command and then you have this running dashboard that just goes. 

+00:45:11 So this is the first thing that I started with and I really like that. 

+00:45:14 But then I also found something called btop. 

+00:45:16 Are you familiar with btop? 

+00:45:18 No, I haven't used this one. 

+00:45:19 Oh my gosh, btop is incredible. 

+00:45:21 This is so good. 

+00:45:22 It's really something. 

+00:45:25 Zoom in on it. 

+00:45:26 So this gives you moving graphs over time of all sorts of things. 

+00:45:31 It shows you graphs of network, inbound, outbound traffic. 

+00:45:36 It shows you the CPUs. 
+ 

+00:45:37 It gives you a breakdown of like, here's all the different distributed work 

+00:45:41 across your eight CPU cores, and overall. 

+00:45:44 It's really something else. 

+00:45:45 And so this one is really nice. 

+00:45:47 You can configure the UI a lot to show and zoom in on disk activity or whatever. 

+00:45:53 This is really a nice way to view it. 

+00:45:55 And again, when you run all these Docker containers, they feel like they're super isolated and tucked away. 

+00:46:00 And from their perspective, they are. 

+00:46:02 But when you look in the process list here, it just shows the process that Docker is running. 

+00:46:06 So I have all my web apps and APIs and stuff setting a process name. 

+00:46:11 So instead of just Python, Python, Python, Python, Python, 

+00:46:13 it'll say like, Talk Python, Granian worker 1, Talk Python, Granian worker 2. 

+00:46:19 Versus indexing service daemon. 

+00:46:22 And then when you look into any of these tools, you can see exactly what is busy. 

+00:46:26 And those are actually the names inside of Docker, but they still surface exactly like 

+00:46:31 that to all these tools. 

+00:46:32 One of the things you said kind of hit home for me, like it was subtle and it kind of 

+00:46:37 moved on, which was like, if you interconnect it correctly, right? 

+00:46:41 Like if you get the files and sockets going, this goes smoothly. 

+00:46:45 And I think it's one of the things you've done very, very well in the book is sort of 

+00:46:49 walking through that. Like, as you talk about the different Docker configurations, like, okay, 

+00:46:53 well, this is why we're putting this here rather than in the container, this is going to be shared. 

+00:46:57 And the reason for this. I assume some of that was experimental? 

+00:47:03 You just sort of, over time, you kind of went, oh, okay, wait, I need that somewhere else. 

+00:47:08 Yeah. 
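The setup being described, a monitoring tool running as a container but still able to see the host and the other containers because the right socket and namespaces are shared in, looks roughly like this. A sketch from memory of the upstream Glances docs, so double-check the flags there; nicolargo/glances is the image name they publish:

```shell
# Hypothetical sketch: mount the Docker socket read-only so Glances
# can list and inspect the other containers, and share the host's
# PID namespace so it can see host processes too.
docker run --rm -it \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  --pid host \
  nicolargo/glances
```

Nothing gets installed on the host itself, which is the "leaves it a little more pure" point: delete the container and the server is untouched.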
+ 

+00:47:08 Or was it, you know, was there somebody's knowledge that you depended 

+00:47:14 on a lot there? 

+00:47:14 How did you get there? 

+00:47:16 How organic was the journey? 

+00:47:18 I would say half and half. Like some of it, for example, the Glances stuff, I just found 

+00:47:25 when I went looking for it, that there were ways to install it. And it just said, oh, 

+00:47:29 you could just install it by running this Docker thing. And it's like a big, long command. And I'm 

+00:47:33 like, oh, that's cool. Because it doesn't matter how long the command is. What I would do is I'll 

+00:47:39 go into my .zshrc and say alias glances equals paste. And then I saved that somewhere, and I 

+00:47:46 never... I couldn't tell you what it is at all. I just know it has to, like... yeah, it has a few things 

+00:47:51 so it can, like, look back into the host to see, you know, what's running and so on. 

+00:47:57 Yeah. So a lot of it was like that. And then some of it was definitely, you know, two whole days of, like, why 

+00:48:03 won't that talk to that? Let me build it again. Let me do some more. You know what I mean? And 

+00:48:07 eventually, like, okay, all right. But once you get it kind of dialed, then, once you get a little bit 

+00:48:11 of it working, it's a blueprint. You just, like, again, again, again. So you seem to have taken a bit of 

+00:48:16 a heavier-weight approach here. It's everything and the kitchen sink. That 

+00:48:23 implies that it's not the right amount, but it's counter to some of the advice that's out there. 

+00:48:28 Sometimes folks talk about, you know, wanting to have things as minimal as possible. Why what 

+00:48:34 you've done versus the other? How are they wrong? Can we start a flame war on the internet? 

+00:48:40 Let's do it. Let's see how many "Michael, you're wrong" I can get into the YouTube comments. 
+ 

+00:48:45 Actually, please, that's not a challenge. So here's the deal. I want, especially at the 

+00:48:53 beginning of this journey, when I was like, I want as much comfort and visibility as I can get 

+00:48:59 in these containers and other areas. You know what I mean? And I wanted to make it as close to, 

+00:49:04 if I just set up a VM and just, you know, uv venv, and just roll from there, right? So what I did 

+00:49:10 is I said, okay, I could try to go for like the super slim Docker image, or I could just get like 

+00:49:17 a basic Docker image, but then put, you know, I put like Oh My Zsh and zsh on it, right? 

+00:49:23 Does it need it? No, you could use sh, but do you know what happens when you use sh and you go 

+00:49:28 in there? It's a brand new start. It doesn't remember anything, any command you've 

+00:49:32 ever run. It doesn't give you any help. You know, you hit tab, it doesn't tell you nothing, right? 

+00:49:37 You're like, oh gosh. But if you use like Oh My Zsh, it'll show you, hey, what version of Python, 

+00:49:42 is your virtual environment activated? And I can just hit command-R and, you know, filter all 

+00:49:48 my history, and I can hit git tab and it'll autocomplete all the git commands that I forgot what 

+00:49:53 I was supposed to use because I'm freaked out because the site is down and how do I fix this? 

+00:49:56 I mean, I wouldn't actually be in the container for that, but a lot of times you're in there kind 

+00:50:01 of exploring because you're like, it's been fine for six months, but I need to see something. 

+00:50:05 And so in the book, at least in the beginning, I recommended to people that they install 

+00:50:11 some of these tools that you might install into your own terminal to make you feel more comfortable. 

+00:50:15 So that my assumption is you're kind of new to Docker. You're feeling a little uncomfortable. 
+ 

+00:50:20 Like who cares if it's another hundred megs on your hard drive? You're not shipping your app 

+00:50:27 to Docker Hub. You're not going to take your web app, probably. It's not a reusable component. 

+00:50:33 You've got your source code and you want the thing to just run here. You're not shipping it so that 

+00:50:38 whoever wants to run, you know, indeed.com can just docker pull that and run it. Like, 

+00:50:43 that's not what it is. And in that context, you're not so worried about the space. And there's a 

+00:50:48 couple of tips that you can use for really, really fast build times, right? So I mean, like, 

+00:50:53 container build times for me are like seconds, a few seconds, even though, you know, there's 250 

+00:50:59 pip-installed packages for Talk Python Training, and build time is also 

+00:51:06 installing Python, right? You can make these things fast, so it's not like a huge impedance. 

+00:51:11 But I think for people who are new to it, having something other than just sh, not even bash, 

+00:51:17 you're a lot better off. So that's what I promoted. And I think it kind of comes back 

+00:51:21 to sort of the thesis of the book as well, right? 

+00:51:23 Which is, which one is right for you? 

+00:51:25 So, you know, if you are going to be running a thousand of these spread 

+00:51:30 across a whole bunch of different cores, then yeah, if you optimize this, that 

+00:51:34 might change your cost framework and everything else. 

+00:51:37 Well, right. 

+00:51:37 Or if, you know, 27 containers on eight CPUs works fine, then, you 

+00:51:44 know, go for it. 

+00:51:45 You know, why get in your own way? 

+00:51:48 And that advice is contextual. This is why I really emphasize the 

+00:51:51 context sort of thing, right? 

+00:51:53 This advice is bad if your goal is to ship a container to Docker Hub so that people can 

+00:51:58 self-host your open source thing. 
+ 

+00:52:00 You don't want that to have extra crap that they don't need. 

+00:52:03 But when there's only one of them for your machine and you're building it and you're managing 

+00:52:08 it, you know, make it as comfy and helpful as possible. 

+00:52:12 That was my philosophy. 

+00:52:13 The structure of your site, it has a lot of different pieces to it, using 

+00:52:20 different technology. You spend some time talking about, like, static sites and using static sites for 

+00:52:25 part of it versus, you know, Python applications and those kinds of parts. How did you end up here? 

+00:52:33 Like oftentimes the answer when you're looking at this kind of stuff is, well, I need a CMS for 

+00:52:37 everything. And then I will try to figure out how to square peg my round hole of a CMS or whatever. 

+00:52:43 So how did you end up with a collection? Well, you know, like many things, it started simple, 

+00:52:48 and you're like, well, just one more thing. 

+00:52:50 So I'd kept pretty much the same web framework across all my different sites thinking, 

+00:52:55 okay, I'm going to just pick one and go with it. 

+00:52:57 I think a lot of people do that. 

+00:52:59 You know, there's people who are like, I use Django. 

+00:53:00 There's people, I use Flask, and so on. 

+00:53:02 And then just slowly over time, you're like, really, this part is really a hassle. 

+00:53:08 I'd be a lot better off if I made that part served through the CDN. Or why am I, you know... 

+00:53:14 One of the things that I see a lot, and it doesn't drive me crazy, but I'm just like, 

+00:53:19 yeah, it's probably not necessary, is a lot of people in technology X. 

+00:53:24 For us, that's Python. 

+00:53:25 It could be JavaScript. 

+00:53:26 It could be .NET, whatever, right? 

+00:53:28 People who work extensively in that and have a lot of their identity tied into that, like 

+00:53:32 I do and others. 
+ 

+00:53:34 Like, I'm a Python developer, right? 

+00:53:36 So if I'm going to choose a tool, like, let's say, a CMS or a static site generator or something 

+00:53:41 like that, I'm going to choose the Python one. 

+00:53:43 I'm a Python person. Like, okay, but are there better options out there than the Python ones for what 

+00:53:48 you're trying to do? Because are you going to extend this tool? No? Then what do you care 

+00:53:52 what it's written in, right? Your operating system is not written in Python. It's written in, yeah, 

+00:53:57 yeah, C. Or, I'm not going to use this word processor because it's not written in Python, 

+00:54:02 exactly, so I have to go back... no, I need a new service. Like, you don't see it, you don't 

+00:54:07 have to work with it, you don't care. And so I ended up a little bit with a mishmash of just trying 

+00:54:11 to say, like, what are the best tools? Like, for example, for the blog and some of the other static 

+00:54:16 elements, I've used Hugo, which is written in Go. It's like, okay, I type the command, the command hugo, 

+00:54:22 you know, and it does its thing. I don't really care what it's written in. The templating extension 

+00:54:26 is a little bit annoying, but I kind of just went around and said, okay, well, what do I think 

+00:54:31 would be the best fit to make my life easy, not to reinforce my identity as this type 

+00:54:37 of developer or that type of developer, you know? Yeah. One of the things, you know, I'll show 

+00:54:45 my own stripes here, and you can defend your beloved Flask if you like, but having come from 

+00:54:51 the Django side, some of the things that you've kind of learned organically here are forced on you 

+00:54:57 in Django. So when you, like, the instructions for putting together a 

+00:55:03 production site are: you will run this command and it will move all of your static content over 

+00:55:07 here. 
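For anyone who hasn't seen it, the Django command being alluded to here is collectstatic, the standard management command that gathers every app's static files into one directory a front-end server can serve directly:

```shell
# Copies all static assets into STATIC_ROOT, as configured in settings.py,
# so nginx (or a CDN) can serve them without touching Python.
python manage.py collectstatic
```

That one command is what makes the "my site is actually two different things" mental model explicit: Django handles the dynamic requests, and the collected directory is just files.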
So like your mental model, when you come from that side is, oh, my site is actually built 

+00:55:12 of these, at least these two different things. And I think coming from Flask, 

+00:55:18 that discovery might've been a little more organic. You might not have been forced into it 

+00:55:24 immediately. But once you've come to that realization of, oh, wait, I have these pieces 

+00:55:29 and I can use something like Nginx to tie it all together, 

+00:55:33 that means, well, then I don't have to figure out 

+00:55:35 how to use a CMS for this thing that's very unnatural for a CMS. 

+00:55:39 I can just mount it under slash blog and it'll work fine, yeah. 

+00:55:44 Yeah, Django is very powerful. 

+00:55:46 It definitely is. 

+00:55:46 And I actually talked a lot about that in the book, when evaluating web frameworks. 

+00:55:52 But before we go to that, I think your point about using Nginx 

+00:55:57 to piece things together, or Caddy or Traefik or whatever. It doesn't matter, like, some front-end web server. 

+00:56:03 They all do it. Yep. Yeah. So often people think, I have this Python app, let's say I have a Django 

+00:56:10 app, so I want to add a CMS to it. What could I possibly add? Is it static content? Well, maybe what 

+00:56:15 you should add is Hugo. I don't know, I'm just making this up, right? Like it might actually be a bad 

+00:56:19 option. But, well, Hugo is not a Python thing, so how do I put it into my Django app? I mean, they're very, 

+00:56:24 very different in the way they work, so they don't really go that super well together, if you were to, 

+00:56:28 like, how does one literally, source code wise, go into another? But if you just made like a Hugo site, 

+00:56:35 or other static site, however you make it, and then put it on the same server, and then in Nginx, when 

+00:56:40 you say, if you go to this domain slash docs, it goes completely over here, and if it goes 
anywhere 

+00:56:46 else, it goes just to Django, and all of a sudden, from the outside, it looks like a very cohesive 

+00:56:52 single thing with just different sub-URLs. But you get to choose the best technology for the static 

+00:56:56 bits, and you get to choose the best technology for your dynamic, data-driven bits, and that is all just 

+00:57:02 done by configuring the front-end web server that you don't even have visibility to in Python. And I 

+00:57:08 think that's a big mental shift, but it's like those kinds of things that bring both the 

+00:57:13 flexibility to make these choices and the simplicity to not try to jam them together. 

+00:57:17 There's a third-party library for Django that I use once in a while, which is called 

+00:57:22 distill, and it's a static site generator based on Django. So say your URL was like books, 

+00:57:29 you know, it's books slash one, book slash two, book slash three. Well, you tell distill, I want book, and 

+00:57:35 I want all the possible queries of this number, and it will generate the results as static. So even 

+00:57:42 when you've got a dynamic site, you can actually carve off the static portion and then have that 

+00:57:47 fed straight out of Nginx, if there's no actual dynamic content on the page, and 

+00:57:53 if it only updates when the database updates or something like that. And you can do it nightly. Like, 

+00:57:57 this gives you all sorts of other options. And, you know, to come back to your eight-processor 

+00:58:02 whatever, well, the static sites are almost free. You do not even need that. You don't even need it. 

+00:58:09 It's nothing. So, like, you can scale way down and have an absolutely mammoth site just by properly 

+00:58:15 fine-tuning what's dynamic and what's static. Yeah, you could go to millions and millions of requests 

+00:58:20 if you just converted all that stuff to static and then put the extra resources, CSS, images, 

+00:58:28 JavaScript, etc., on 
a CDN. Yeah, I mean, that is almost... that's like web on easy mode right 

+00:58:34 there, because it can't go down unless the server literally... almost, it almost can't. Yeah, I mean, 

+00:58:41 let's not say that on the eve of, like... are we on the eve of a fifth going down? 

+00:58:46 But I mean, like, it can't go down because the code is wrong or there's a race condition 

+00:58:51 or you're out of memory. 

+00:58:52 Like, it's really close to just: if the web server is up, and you put a CDN 

+00:58:57 in front of it, then it's not even necessarily that the CDN has to go. 

+00:59:02 You've got to have a Cloudflare-level incident. 

+00:59:04 Yeah. 

+00:59:04 And fully distributed, in many cases, right? 

+00:59:07 So people in, you know, continents other than where you are based are getting fantastic load times because it's cached locally for them. 

+00:59:17 Yeah, I just want to give a little shout out. 

+00:59:19 I'm going to give a little shout out to Bunny.net. 

+00:59:21 Like, I know people are all about Cloudflare, but this is a cool European company that focuses on privacy, has some really nice features. 

+00:59:29 The pricing is great. 

+00:59:30 And you go here, go to the CDN. 

+00:59:33 They've got, somewhere way down here, you know, like 119 different places, including all over Africa. 

+00:59:41 And this is just super, super cheap for traffic. 

+00:59:45 Nice. 

+00:59:45 Yeah. 

+00:59:46 So earlier there, I wasn't intending to force a fistfight. 

+00:59:52 And, you know, we're on the opposite sides of the continent, so that would be a challenge. Django versus Flask, go. 

+01:00:00 But, you know, I think one of my favorite chapters was actually chapter 13, which was titled Picking a Python Web Framework. 

+01:00:07 I really liked the nuance of this. 

+01:00:11 It's unusual for folks to sort of reveal their reasoning. 
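The routing idea from a few minutes earlier, mounting a static build (a Hugo site, a distill output, whatever) under one path and sending everything else to the Python app, is only a few lines of Nginx config. A minimal sketch; the domain, paths, and port here are made up:

```nginx
server {
    listen 80;
    server_name example.com;

    # Static site (e.g. a Hugo build) served straight from disk.
    location /blog/ {
        alias /srv/blog/public/;
    }

    # Everything else goes to the Python app (Django, Flask, etc.)
    # running behind an app server on a local port.
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```

From the outside it reads as one cohesive site with different sub-URLs, which is exactly the mental shift described: the glue lives in the front-end web server, not in Python.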
+ 

+01:00:16 And honestly, I think, like, because I had no intention of tomorrow going and using Docker, the Docker chapters were interesting because I like to see how other people do things. 

+01:00:26 But I could grab the Picking a Python Web Framework chapter, pull it out, and hand it to almost any of my students. 

+01:00:33 Right? Like, it's this, you know, how do I make these kinds of decisions? 

+01:00:37 How is this different? Why do I think about these things? 

+01:00:40 And so often the content on this is really just religious war. 

+01:00:44 And you've done a really, really good job there of just sort of conveying this, you know, hey, here are pros and cons for each. 

+01:00:51 And this is why I picked this. 

+01:00:53 And so I really, really liked how you covered that. 

+01:00:58 Thank you so much. 

+01:00:59 I guess maybe the answer is obvious, but why? 

+01:01:05 Like, you were fine. 

+01:01:07 You were just doing configuration file after configuration file and then a little bit of editorial. 

+01:01:12 What caused you, what was the impetus for spicing it up a little? 
+ 

+01:01:18 Well, I mean, I think it's an important part of the journey, picking a technology to run your code on. So there's actually a couple of places where I have that. Like, I'm trying to create a term, because I don't think we do a good job of disambiguating them from, like, Nginx: Python production app servers, like where your code runs. I think these need a little more disambiguation. I'm talking like Granian, Gunicorn, those kinds of things. 

+01:01:46 Hypercorn, Uvicorn, all those places you run your Python code. 

+01:01:50 So I kind of went through a debate on those, from Michael's context, right? 

+01:01:55 And then I did the same for Python web frameworks. 

+01:01:57 And, you know, I told the story of the Bootstrap and how, every time I have 

+01:02:02 to write new code, I'm like, here we go. 

+01:02:04 I'm in the relic, right in the relic code. 

+01:02:08 I kind of felt the same way. So everything was based on Pyramid, and I loved Pyramid, and 

+01:02:13 I still have a lot of respect for it. 

+01:02:15 The reason I chose Pyramid in 2015 was, when I went to the Flask page, it said, you may potentially be able to use Python 3, but we are not supporting it and we don't recommend it. 

+01:02:27 And I'm like, wait a minute, didn't Python 3 come out in 2008? 

+01:02:31 That's like seven years later. 

+01:02:33 You know what? 

+01:02:34 No, I'm not doing that. 

+01:02:37 I'm starting this project beyond this problem and I'm not going back, you know what I mean? 

+01:02:44 They've since obviously moved on from that. 

+01:02:46 So Flask was out. 

+01:02:48 I looked at Django and I thought, I'm really like a microservice guy. 

+01:02:51 I really want to use Mongo. 

+01:02:52 A lot of things were not quite good fits. 

+01:02:54 They actually would be better fits now, right? 

+01:02:56 Even then. 

+01:02:57 Yeah, no, if you want to do Mongo, that's, yeah, that's almost a deal breaker. 

+01:03:02 Yeah. 

+01:03:02 Yeah, I know. 

+01:03:03 Almost. 

+01:03:03 And so I'm like, all right, well, maybe not Django. 

+01:03:06 Well, you had Pyramid. 

+01:03:07 They're like, we are trying to embrace the latest standards. 

+01:03:11 We're Python 3 first, et cetera, et cetera. 

+01:03:13 And I'm like, all right, I'm gonna give this a chance, 

+01:03:16 even though it wasn't as popular. Like, this is great. 

+01:03:17 And I used it for eight years, seven years, something like that. It was really good. 

+01:03:22 But things evolved over time, right? 

+01:03:24 Like Pydantic came out and Pydantic was great. 
+
+01:03:27 What's a really nice way to talk to the database with Pydantic? Beanie, okay?
+
+01:03:32 So I can do Beanie and I can do Pydantic and wow, what a really nice, clever way to write databases.
+
+01:03:38 And oh, Beanie's all async only, Pyramid's synchronous only.
+
+01:03:43 When was the last commit to Pyramid?
+
+01:03:45 Oh, it was two and a half years ago.
+
+01:03:47 Chances that it gets async support are low 'cause that was just like a minor bug fix.
+
+01:03:51 You know what I mean?
+
+01:03:52 It's just like, it's fine.
+
+01:03:54 Open source projects, they ebb and they flow and they come and they go.
+
+01:03:57 But I'm just like, I should really move this forward to something that feels like it's active, right?
+
+01:04:03 I mean, stuff in the web makes me nervous.
+
+01:04:05 I'm always just, did you put a port open on the internet?
+
+01:04:08 Well, that's scary.
+
+01:04:09 - Yep.
+
+01:04:10 And so a framework that felt like things were not as on top of it as they could have been made
+
+01:04:16 me nervous.
+
+01:04:17 To be fair, I don't know that they had any security incidents or very, very few because
+
+01:04:21 it did so little, right?
+
+01:04:23 It's not like it had a bunch of admin pages or something where there's like accepting
+
+01:04:27 input, but still, still same reason.
+
+01:04:29 So I'm like, I really want to use these more modern tools, typing, async, Pydantic, et cetera.
+
+01:04:37 And I kind of would not like to continue building on something that feels like it's no longer being supported.
+
+01:04:42 And similarly, you, with chapter 13, sort of that, you know, the different thought process there.
+
+01:04:49 You also provide chapter 15, which is a retrospective on Hetzner, which is the hosting provider that you chose.
+
+01:04:55 And again, I think it's pretty clear.
+
+01:04:59 I think I've said it three different ways.
+ +01:05:01 My favorite stuff in the book really is sort of this, you know, that the little, the little, insight into Michael's brain, right? + +01:05:08 Like how did he make this decision and how happy is he with these decisions? + +01:05:12 Right. + +01:05:13 I think that's the stuff that's, that's, globally applicable to a reader, which is nice. + +01:05:18 so you've, it's now even a few months further on with Hetzner. + +01:05:22 So you, you still happy? + +01:05:24 Any regrets yet? + +01:05:26 Yeah, no, no regrets. + +01:05:28 It hasn't been absolutely a hundred percent smooth. + +01:05:30 Let's see. + +01:05:31 I could tell you how long it's been if I can get past all the ads. + +01:05:34 There we go. + +01:05:35 So I actually blogged about this. + +01:05:38 And yeah, so it's been about a year, I guess. + +01:05:42 No regrets. + +01:05:43 I would say if people are out there looking around, to me, it really, + +01:05:48 and you want to follow the philosophy of Michael, like carve yourself a space out in a multimillion dollar data center + +01:05:54 that doesn't have anything to do with it. + +01:05:55 And you just run your code in your own space. + +01:05:58 DigitalOcean and Hetzner are the two ones. + +01:06:01 And I did DigitalOcean for a long time. + +01:06:03 When Hetzner came out, I thought they just had some really interesting appeal. + +01:06:07 I started seeing a lot of people talking about them. + +01:06:09 And they are a German company. + +01:06:12 And they were just in Europe. + +01:06:14 And I'm like, I love Europe, but the majority of my customers are in the U.S. + +01:06:17 So what is the best place for my server? + +01:06:21 Probably the east coast of the United States, because that serves the U.S. really well. + +01:06:25 But then it's like one hop across the ocean to all of Europe as well. + +01:06:28 So it's still really fast from there and so on. 
+
+01:06:32 So I didn't want to move my server to Europe when I felt like being closer to the US was more important.
+
+01:06:38 Not so much because I needed to manage it.
+
+01:06:40 I could SSH to wherever, but just East Coast to the US.
+
+01:06:43 And then they're like, hey, we have two new US data centers.
+
+01:06:47 One near Virginia, right by the US East 1, the infamous AWS US East 1.
+
+01:06:53 And the other one actually in Hillsboro, Oregon, just down the street from me, which is funny.
+
+01:06:58 Yeah, it's like I could drive to it in like 20 minutes,
+
+01:07:02 which of all places in the world is relatively close.
+
+01:07:04 So I went and looked at it and I said, let me just check it out.
+
+01:07:07 And the prices are super cheap.
+
+01:07:09 You get a little bit less support and I think a little bit less top tier data center
+
+01:07:16 than DigitalOcean, but the prices are like insane there.
+
+01:07:19 Like I said, eight core server for 30 bucks.
+
+01:07:23 You know, that's insane.
+
+01:07:25 And when I first signed up, that came with 20 terabytes of free traffic.
+
+01:07:30 Wow.
+
+01:07:30 Which is about $1,700 out of AWS.
+
+01:07:36 Right.
+
+01:07:36 Included in your $30 bill.
+
+01:07:38 You know what I mean?
+
+01:07:39 Like, oh my gosh.
+
+01:07:41 Yeah.
+
+01:07:41 So yeah, I talk a lot about it in the book, but yeah, I went over and moved my stuff over there
+
+01:07:46 and it's been good.
+
+01:07:48 I've had one incident where the machine that it was on died.
+
+01:07:52 The one big server, wherever it was, it died and they had to move it,
+
+01:07:57 which blows my mind, they were able to hot relocate it to another server.
+
+01:08:01 But the problem is it has an external, like a 300 gig external drive,
+
+01:08:06 and that didn't move location.
+
+01:08:07 So all of a sudden, a lot of the IO operations were much slower 'cause they weren't close
+
+01:08:12 to the server anymore.
+
+01:08:13 - Right, right.
+
+01:08:14 - Why, why did my Docker builds take two minutes?
+
+01:08:17 They used to take about three or four seconds.
+
+01:08:18 I cannot figure it out.
+
+01:08:20 And I wrote them, they're like, no, we've tested it.
+
+01:08:22 There's no problem.
+
+01:08:22 I don't care what you say, there's a huge problem.
+
+01:08:25 Eventually it's like, we moved it again, it's fine. And then it was fine, right? So, you know, if
+
+01:08:31 if folks are looking for something slightly lighter weight, and this is going to sound like a commercial,
+
+01:08:37 I'm just a happy customer, I don't know, sponsorship or whatever, but I use OpalStack with a lot of my
+
+01:08:42 clients. Um, what do you go, OpalStack? Opal? Um, yeah. And you wouldn't go full Docker with it, um, but
+
+01:08:49 but they do give you full SSH access to the box.
+
+01:08:54 And they've got a neat little sort of packaging thing.
+
+01:08:56 They don't support a lot of things, but if you've got like Django or Flask
+
+01:09:00 or static files for Nginx or whatever, you hit a couple of buttons in the dashboard
+
+01:09:04 and it spawns it up.
+
+01:09:06 But a lot of tools like that, it spawns it up and then you're not allowed to touch it.
+
+01:09:10 What they do is create all the entries in the directories
+
+01:09:12 and then you can SSH into the box and get at the files themselves.
+
+01:09:15 So I find it's a nice little compromise between the two.
+
+01:09:19 It would not scale to what you're doing.
+
+01:09:22 But if folks are looking for a relatively inexpensive thing to experiment with, I find it's a nice little stopgap.
+
+01:09:28 Yeah, that's awesome.
+
+01:09:29 I'm always interested in finding those types of things.
+
+01:09:33 This one is new to me.
+
+01:09:34 This is cool.
+
+01:09:34 Yeah, this is, I'm trying to remember, there was a site I used to use.
+
+01:09:39 It got bought.
+
+01:09:40 Half of the founders went, we don't want to be bought, and took their baseball bat and created OpalStack.
+
+01:09:46 So I used to be a client of the original and followed them along.
+
+01:09:50 So yeah.
+
+01:09:51 Cool.
+
+01:09:51 And very happy with like the service as well.
+
+01:09:54 They're like, you open a ticket and things are very, very human, which is nice in this day and age.
+
+01:09:59 You usually, I'm usually expecting to talk to a bot.
+
+01:10:02 You're getting about as much support as you get out of Gmail.
+
+01:10:05 Yeah, exactly.
+
+01:10:06 Google Docs, which is none.
+
+01:10:08 Another thing worth a shout out here is sort of an alternate way of working with Docker and Docker Compose directly
+
+01:10:15 that I propose in the book is something called Coolify.
+
+01:10:18 Are you familiar with Coolify?
+
+01:10:20 No, this one I don't know.
+
+01:10:21 Yeah, this is super interesting.
+
+01:10:22 So what this does is it knows how to run Docker, Docker Compose,
+
+01:10:28 but it also gives you all sorts of easier ways.
+
+01:10:31 So if people look at what I'm proposing, they're like, no, Michael, too complicated.
+
+01:10:35 This is interesting because what it gives you is it basically gives you your own private Heroku
+
+01:10:40 or Netlify or Vercel or Railway, or you can go in, I don't know how to find it,
+
+01:10:45 from here, but you can also go in and say, let me find any self-hosted app.
+
+01:10:50 - Okay.
+
+01:10:51 - And they've got hundreds of them in there.
+
+01:10:52 And then you just type in the name and say, install this set of Docker containers
+
+01:10:57 as a Docker compose file into my server.
+
+01:10:59 So you could create the one big server, which is your own space in someone's cloud.
+
+01:11:04 And then you can install this or you can pay them five bucks a month
+
+01:11:07 and they'll actually manage the server, manage the deployments,
+
+01:11:11 do like rollouts of new versions of your app.
+
+01:11:15 Stuff like that, right? It sounds like, it sounds like it makes it way easier, right? It actually makes it,
+
+01:11:22 it's like two steps forward, 1.8 steps backwards, right? Because, you know, instead of using .env
+
+01:11:29 files, you've got these, like, a UI to enter a bunch of environment variables, and the saving of them
+
+01:11:35 is weird, and you're like, oh, I forgot to press save on these three even though I saved the page. I mean,
+
+01:11:39 there's just, right, right. It promises more ease than you would think, and I'm not necessarily,
+
+01:11:45 switch, I do like, I've played with it some. I'm not saying I would switch to it given a choice,
+
+01:11:50 but it does, it does ease you in. It's a little bit like PythonAnywhere. Like, I'm sure when I started
+
+01:11:55 there were things that could have gotten in my way, but the stuff, the support that it gave me made it
+
+01:12:00 possible for me to feel, like, comfortable and get going. I feel like this might be an option for
+
+01:12:04 people who care.
+
+01:12:05 Right.
+
+01:12:06 But let me give you an example.
+
+01:12:07 For example, I could go install an app that has Postgres, Valkey, and the web app.
+
+01:12:15 If I, then I just click install that from whatever self-hosted definition that comes
+
+01:12:20 from, it creates those three containers.
+
+01:12:22 And then on the container setting through the image setting, I don't know really how
+
+01:12:26 you think of it.
+
+01:12:26 I mean, they're not, I guess it's image sort of, you can go to the web part and say, and
+
+01:12:31 just use this URL and it'll automatically do the SSL generation as part of that.
+
+01:12:36 Then you go to the database, the Postgres thing, you say, oh, and make backups for
+
+01:12:39 me daily and store them in this S3 compatible storage.
+
+01:12:43 And that kind of stuff is a lot of extra when you're doing it yourself and you
+
+01:12:47 just go check those boxes.
+
+01:12:48 So that's the, that's the two steps forward, but then there's the
+
+01:12:52 step back.
+
+01:12:52 Yeah.
+
+01:12:52 Yeah.
+
+01:12:52 Well, and that tends to be also what makes people nervous, right?
+
+01:12:56 So like that, and that's, you know, I still use managed database simply because I don't want to
+
+01:13:02 have to think about it, right? Like it's like, yeah, okay. I'm perfectly fine with pointing my
+
+01:13:07 app at a managed database and let somebody else think about backing it up and all the rest of it.
+
+01:13:12 Yeah. Yeah. You know, one thing about managed databases that I don't like, and I can't speak
+
+01:13:16 to all potential hosts of them, but certainly some of them, some well-known ones, some names I've
+
+01:13:21 already said, if you get a managed database there, that database server is listening on the public
+
+01:13:26 internet. I very much do not espouse having a database listening. Yeah, it has a password,
+
+01:13:34 but I mean, that's that database. I'm always worried about what is in the database.
+
+01:13:38 That's interesting. I've never thought to even check that.
+
+01:13:41 And on my setup, not only is it not listening on the internet, it's not even listening on the
+
+01:13:46 host. There's like a private Docker network that only things right in the Docker, shared Docker
+
+01:13:52 network can even know about or see the data, you know what I mean? So there's, yep, it's, it's less
+
+01:13:58 likely to, fewer holes. But I have to make backups, and if I don't, it's bad. If it goes down, it's real
+
+01:14:04 bad. And it did one time this year. I was down for like 10 minutes, about to pull a hair out. There goes,
+
+01:14:09 - That was your sixth nine, yes.
+
+01:14:11 - Exactly, I know.
+
+01:14:13 So the problem was just for people who wanna benefit from my suffering and not suffer themselves,
+
+01:14:19 is I did not, on the Docker pull for the database image,
+
+01:14:23 I didn't pin it to a major version.
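[Editor's note: the unpinned-image mistake described here can be avoided with an explicit major-version tag. A minimal sketch of a Compose service; the image and names are illustrative, not taken from the episode's actual setup.]

```yaml
# docker-compose.yml (sketch): pin the database image to a major version
# so a routine pull cannot silently jump to an incompatible release.
services:
  db:
    # "mongo:8.0" stays within the 8.0 series; a bare "mongo" or
    # "mongo:latest" can upgrade underneath you and then refuse to
    # start against the old on-disk data format.
    image: mongo:8.0
    volumes:
      - db-data:/data/db
volumes:
  db-data:
```

The trade-off is that pinned images must be bumped deliberately, but the upgrade then happens on your schedule, with a backup taken first.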
+ +01:14:26 And so it upgraded and then it said, well, you have old data in your file system + +01:14:29 and we're not gonna upgrade it for you automatically. + +01:14:31 So we're not gonna run. + +01:14:32 I'm like, why is the database server not running? + +01:14:33 It just, and it was like a weird update. + +01:14:36 It was like 8.2.1 that broke. + +01:14:39 Well, why? + +01:14:40 Point one. + +01:14:41 Surely, surely a bigger number needs to be like, this will never run again. + +01:14:49 Anyway, you know, you find the stuff out the hard way, but yes. + +01:14:51 Well, yeah, that that's the negative, not using a managed database. + +01:14:56 Yes. + +01:14:56 Yeah. + +01:14:57 Yeah. + +01:14:57 Cause you have to deal with some of that kind of stuff yourself. + +01:14:59 Yeah. + +01:15:00 So I thought we would wrap up by reviving an old tradition. + +01:15:04 I have two questions for you. + +01:15:06 What is, what, what, what is your development environment? + +01:15:09 and what library are you excited about? + +01:15:12 So the development environment right now is a mix of Cursor and PyCharm for sort of editing. + +01:15:21 And despite this very detailed conversation about Docker, + +01:15:24 I don't use Docker very much locally for development. + +01:15:28 I just use virtual environments. + +01:15:29 And I want to give a shout out to Hynek, who I had some back and forth about + +01:15:33 when I was writing some of this stuff that gave me some really good ideas. + +01:15:36 And he has a really good article, which I referenced in the book, + +01:15:38 about you just use virtual environments. + +01:15:41 Keep everything consistent, right? + +01:15:42 That's an interesting debate that we don't have time for, + +01:15:45 but it's very fun. + +01:15:46 So uv, I'm a huge fan of uv. + +01:15:50 - Particularly in Docker, that makes things that much faster. 
+
+01:15:53 - Yeah, because you can just say in your Docker, it used to be you're like, okay, well,
+
+01:15:57 I gotta use Docker and I need to use Python.
+
+01:15:59 So let me use the official Python distribution for Docker
+
+01:16:01 because I need to have Python.
+
+01:16:03 And then, well, that excludes 99.9% of all the other things you could build upon
+
+01:16:10 that already have something that's harder to manage set up for you, right?
+
+01:16:14 But in your Dockerfile, you just say, run uv venv --python 3.14,
+
+01:16:20 you've installed Python 3.14 in two seconds.
+
+01:16:23 And it's cached, right?
+
+01:16:24 It's like, yeah, it just, it makes it so much faster and so powerful, but also just in general, right?
+
+01:16:30 Like it's unified so many tools that I like that are just, it's all there together.
+
+01:16:34 And then library, oh my goodness.
+
+01:16:37 - Now you know how it feels. I know how it feels, and now, I didn't warn you on purpose. I love it, I love it. So there's,
+
+01:16:45 there are a bunch of ones I've been playing with lately, and I'm trying to think which one I've,
+
+01:16:49 I've used. I don't really have a great answer to this, Chris, I'm, I'm afraid to say. I would say,
+
+01:16:54 let's keep it, let's keep it flowing with some of the vibes that we had here. I would say,
+
+01:16:59 let me give a shout out to setproctitle, which, there you go, sounds insanely silly. Like, the goal
+
+01:17:06 of that is, so in your process, in your Python process, and I actually use this
+
+01:17:10 on a bunch of different things, in your Python process you can say setproctitle
+
+01:17:14 dot setproctitle and you give it the name of whatever you want your process
+
+01:17:18 to be. So why does that matter? When you pull up all these tools like Glances,
+
+01:17:24 btop, or others, anything that looks at processes, basically instead of seeing
+
+01:17:28 Python Python Python, node node node, postgres postgres postgres, at least the
+
+01:17:32 Python ones now have
meaningful names. And you might be thinking, well, Michael,
+
+01:17:36 that's production stuff, useless to me.
+
+01:17:38 No, it's good for development too.
+
+01:17:40 Have you ever had the idea, like I wanna know how much memory my process is using.
+
+01:17:46 Is it using a lot or a little?
+
+01:17:47 So you pull up, you know, Activity Monitor, Task Manager, whatever, you see Python, Python, Python,
+
+01:17:52 you're like, oh man, I know my editor's using one of these
+
+01:17:55 or whatever, but which one is it?
+
+01:17:57 - And if you're using the right terminal, it'll change the terminal's title too,
+
+01:18:02 because most terminals respond to the proc name.
+
+01:18:05 Oh, that's a very nice touch. Yeah. Okay. Yeah. So, but if you, if you do that in development,
+
+01:18:10 if you just set the name of your process to be like, you know, my utility or whatever the heck
+
+01:18:15 you call it, right. Then when you go into process management tools, like even just for Mac or
+
+01:18:19 Windows or whatever, you'll see it and you can see how much CPU is it using? Is it using a lot of RAM?
+
+01:18:25 If you've got to end task it, like we now have another reason that this is something we've
+
+01:18:29 got to do all the time is the, sometimes the agentic AI things go mad and they start a bunch of
+
+01:18:35 servers, and then they lose track of them, and then you can't run anymore because it says, um, port is
+
+01:18:39 in use. You're like, but where? Like, something in that stream of text that shot by for five minutes,
+
+01:18:45 it started one and then it left it going. But then you pull it up, it says python python python, and
+
+01:18:51 like, well, I don't want to kill the other thing that's running, you know what I mean? And so it also
+
+01:18:56 gives you a way to kill off your AI-abandoned stuff that it went mad on. So there you go, setting
+
+01:19:01 a process name might save you a reboot. There's your little nugget to take away from the podcast.
+
+01:19:06 Exactly.
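[Editor's note: the one-function library mentioned here can be sketched like this. The process name is just an example, and the import is guarded in case the package isn't installed.]

```python
# Hedged sketch of the setproctitle usage described above.
# The title string "talkpython-worker" is illustrative; pick any name
# you want to see in top/htop/Activity Monitor instead of a bare "python".
try:
    import setproctitle

    setproctitle.setproctitle("talkpython-worker")
    current_title = setproctitle.getproctitle()
except ImportError:
    # Fall back gracefully when the package is not available.
    current_title = None

print(current_title)
```

With the title set, process listings and most terminal title bars show the chosen name instead of the interpreter's.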
It's a package with one function, but it's a good one. + +01:19:11 Excellent. Well, thank you for having me on. This has been fun to sort of reverse the tables on you. + +01:19:16 It's been great. + +01:19:18 Yeah. Chris, thank you so much. I really appreciate it. And always great to catch up with you. Bye. + +01:19:21 It's been fun to be here. + +01:19:23 This has been another episode of Talk Python To Me. Thank you to our sponsors. Be sure to check + +01:19:27 out what they're offering. It really helps support the show. Look into the future and see bugs before + +01:19:32 they make it to production. Sentry's Seer AI code review uses historical error and performance + +01:19:38 information at Sentry to find and flag bugs in your PRs before you even start to review them. + +01:19:44 Stop bugs before they enter your code base. Get started at talkpython.fm/seer-code-review. + +01:19:51 Agency. Discover agentic AI with agency. Their layer lets agents find, connect, and work together. + +01:19:57 any stack, anywhere. Start building the internet of agents at talkpython.fm/agency spelled + +01:20:04 A-G-N-T-C-Y. If you or your team needs to learn Python, we have over 270 hours of beginner and + +01:20:10 advanced courses on topics ranging from complete beginners to async code, Flask, Django, HTMX, + +01:20:17 and even LLMs. Best of all, there's no subscription in sight. Browse the catalog at talkpython.fm. + +01:20:23 And if you're not already subscribed to the show on your favorite podcast player, + +01:20:27 What are you waiting for? + +01:20:29 Just search for Python in your podcast player. + +01:20:30 We should be right at the top. + +01:20:32 If you enjoyed that geeky rap song, you can download the full track. + +01:20:35 The link is actually in your podcast player show notes. + +01:20:37 This is your host, Michael Kennedy. + +01:20:39 Thank you so much for listening. + +01:20:40 I really appreciate it. + +01:20:42 I'll see you next time. + +01:20:55 Thank you. 
+ diff --git a/transcripts/531-talk-python-in-prod.vtt b/transcripts/531-talk-python-in-prod.vtt new file mode 100644 index 0000000..f89f290 --- /dev/null +++ b/transcripts/531-talk-python-in-prod.vtt @@ -0,0 +1,4358 @@ +WEBVTT + +00:00:00.020 --> 00:00:02.700 +Have you ever thought about getting your small product into production, + +00:00:02.920 --> 00:00:05.160 +but are worried about the cost of the big cloud providers? + +00:00:05.750 --> 00:00:08.640 +Or maybe you think your current cloud service is over architected + +00:00:08.690 --> 00:00:09.820 +and costing you too much? + +00:00:10.380 --> 00:00:13.940 +Well, in this episode, we interview Michael Kennedy, author of Talk Python + +00:00:14.130 --> 00:00:17.600 +in Production, a new book that guides you through deploying web apps + +00:00:17.750 --> 00:00:19.680 +at scale with right sized engineering. + +00:00:20.480 --> 00:00:25.400 +This is Talk Python To Me, episode 531, recorded November 26, 2025. + +00:00:44.420 --> 00:00:47.900 +Welcome to Talk Python To Me, a weekly podcast on Python. + +00:00:48.380 --> 00:00:50.140 +This is your guest host, Christopher Trudeau. + +00:00:50.780 --> 00:00:53.260 +Follow me on bluesky, where I'm trudeau.dev. + +00:00:53.900 --> 00:00:59.020 +You can follow the podcast or this week's guest on Mastodon, @talkpython for the show + +00:00:59.360 --> 00:01:02.980 +and @mkennedy for the guest, both on fosstodon.org. + +00:01:03.380 --> 00:01:08.580 +And keep up with the show and listen to over nine years of episodes at talkpython.fm. + +00:01:09.180 --> 00:01:13.540 +If you want to be part of our live episodes, you can find the live streams over on YouTube, + +00:01:14.040 --> 00:01:20.160 +subscribe to our YouTube channel at talkpython.fm/youtube and get notified about upcoming shows. + +00:01:21.700 --> 00:01:24.280 +Look into the future and see bugs before they make it to production. 
+ +00:01:25.080 --> 00:01:30.360 +Sentry's Seer AI code review uses historical error and performance information at Sentry + +00:01:30.690 --> 00:01:34.700 +to find and flag bugs in your PRs before you even start to review them. + +00:01:35.380 --> 00:01:37.200 +Stop bugs before they enter your code base. + +00:01:37.700 --> 00:01:41.580 +Get started at talkpython.fm/seer-code-review. + +00:01:42.120 --> 00:01:44.080 +And it's brought to you by Agency. + +00:01:44.720 --> 00:01:46.500 +Discover agentic AI with Agency. + +00:01:47.000 --> 00:01:49.760 +Their layer lets agents find, connect, and work together. + +00:01:50.100 --> 00:01:51.080 +Any stack, anywhere. + +00:01:51.740 --> 00:01:57.680 +Start building the internet of agents at talkpython.fm/agency spelled A-G-N-T-C-Y. + +00:01:58.420 --> 00:02:00.240 +Michael, welcome to Talk Python To Me. + +00:02:00.600 --> 00:02:01.440 +You know, I looked it up. + +00:02:01.680 --> 00:02:03.220 +You've been on the show more than anyone else. + +00:02:03.800 --> 00:02:06.340 +But in case there's new listeners, tell us a bit about yourself. + +00:02:06.840 --> 00:02:07.240 +Incredible. + +00:02:07.700 --> 00:02:09.000 +Good to be here with you, Christopher. + +00:02:09.320 --> 00:02:12.820 +A bit of a turn of the tables, I would say. + +00:02:13.300 --> 00:02:18.180 +And it's, you know, long time listeners, I'm sure they know all the little details because + +00:02:18.400 --> 00:02:19.860 +I work them in here and there. + +00:02:20.080 --> 00:02:21.900 +I think that's kind of fun to just make things personal. + +00:02:22.120 --> 00:02:24.280 +But I've also said this on the show, + +00:02:24.540 --> 00:02:27.560 +and I'm sure it's a surprising fact that you know as well, + +00:02:27.740 --> 00:02:30.100 +but over half of the people in the Python space + +00:02:30.320 --> 00:02:31.500 +have only been here for two years. + +00:02:31.840 --> 00:02:33.500 +Yeah, I keep seeing that stat. + +00:02:33.620 --> 00:02:33.760 +Yep. 
+ +00:02:34.140 --> 00:02:34.720 +That's crazy, right? + +00:02:34.780 --> 00:02:36.400 +So even if people, you know, + +00:02:36.480 --> 00:02:39.260 +I told this story about my background five years ago, + +00:02:39.440 --> 00:02:41.220 +like those people weren't here, like half of them. + +00:02:41.480 --> 00:02:42.900 +So crazy, crazy stuff. + +00:02:43.480 --> 00:02:45.840 +All right, so my backstory, I was, + +00:02:47.120 --> 00:02:48.420 +I thought I would be a mathematician. + +00:02:48.840 --> 00:02:49.620 +I studied math. + +00:02:49.980 --> 00:02:52.680 +stuff like that in college, was working on my PhD, + +00:02:53.440 --> 00:02:56.660 +started doing a bunch of work with silicon graphics, + +00:02:57.120 --> 00:02:58.740 +mainframe, supercomputer type stuff + +00:02:59.160 --> 00:03:00.380 +so that I could do my math research. + +00:03:00.540 --> 00:03:02.300 +And I realized, wow, this programming stuff + +00:03:02.300 --> 00:03:03.520 +is way more fun than math. + +00:03:04.500 --> 00:03:05.420 +How do I change gears? + +00:03:05.660 --> 00:03:09.000 +And so that was like 1998, 99. + +00:03:09.540 --> 00:03:10.360 +Haven't looked back. + +00:03:10.500 --> 00:03:13.220 +I've been programming since then and super fun, + +00:03:13.320 --> 00:03:16.460 +a couple of languages and around 10, 11 years ago, + +00:03:16.800 --> 00:03:19.660 +started Talk Python, year after that, quit my job. + +00:03:19.960 --> 00:03:21.960 +Made Talk Python my full-time job. + +00:03:22.480 --> 00:03:23.820 +Started offering courses as well. + +00:03:23.980 --> 00:03:25.720 +That's something that people don't necessarily know. + +00:03:26.080 --> 00:03:28.720 +That sometimes they'll ask, well, what do you do for your job, actually? + +00:03:28.760 --> 00:03:29.980 +I'm like, well, we're doing it. + +00:03:31.240 --> 00:03:33.440 +So anyway, that's me. + +00:03:34.400 --> 00:03:35.920 +A super, super fan of Python. + +00:03:36.520 --> 00:03:37.520 +Super fan of programming. 
+ +00:03:38.020 --> 00:03:40.340 +Every day I wake up just like, wow, how awesome is today? + +00:03:40.620 --> 00:03:43.420 +Look at all the new stuff that came out that we learned that we can do. + +00:03:43.680 --> 00:03:46.380 +Like new libraries, AI stuff these days. + +00:03:47.060 --> 00:03:48.860 +Yeah, there's always plenty to talk about. + +00:03:49.140 --> 00:03:50.040 +It's incredible times. + +00:03:50.420 --> 00:03:51.160 +It's incredible times. + +00:03:51.600 --> 00:03:57.200 +And you've added a new trophy to the mantle, I guess. + +00:03:57.340 --> 00:03:58.260 +You've written a book. + +00:03:58.880 --> 00:04:00.000 +I have written a book. + +00:04:00.540 --> 00:04:02.900 +You know, that's a little bit, I'll put it this way. + +00:04:02.960 --> 00:04:07.160 +It's not something I ever saw myself doing, but I'm really excited that I did. + +00:04:07.580 --> 00:04:12.220 +And yeah, it took, I spent a couple of months properly writing it. + +00:04:12.400 --> 00:04:16.760 +You know, I really put in my energy in and like all projects, you think you're about done. + +00:04:18.680 --> 00:04:22.440 +Yeah, that first 80% is nothing like the last 80%. + +00:04:22.560 --> 00:04:25.000 +No, and the last 5% is long. + +00:04:25.840 --> 00:04:28.280 +You know, and it's not just the book. + +00:04:28.440 --> 00:04:30.280 +It's like, okay, well, where am I going to sell it? + +00:04:30.460 --> 00:04:34.100 +Okay, well, Amazon, and then I'll self-publish, or do I use a publisher? + +00:04:34.540 --> 00:04:37.020 +You end up self-publishing it, but then you're like, how do I? + +00:04:37.260 --> 00:04:39.140 +You know, all these things you learn for the first time. + +00:04:39.220 --> 00:04:41.740 +Like, how do I get it into Amazon to sell even? + +00:04:42.040 --> 00:04:43.240 +And there's a bunch of decisions. + +00:04:44.080 --> 00:04:47.920 +I can tell you, even with having taken the publishing route, it's no easier. 
+ +00:04:48.160 --> 00:04:52.080 +It's just that it goes dark for two months and then all of a sudden it's like, you need + +00:04:52.080 --> 00:04:53.020 +to do this by yesterday. + +00:04:53.340 --> 00:04:57.440 +So yeah, it's not necessarily an advantage either way, I think. + +00:04:57.640 --> 00:04:57.700 +Yeah. + +00:04:57.960 --> 00:04:58.180 +Yeah. + +00:04:58.580 --> 00:04:58.660 +Yeah. + +00:04:58.720 --> 00:05:02.740 +I was really on the fence and I thought, look, let me just try this myself. + +00:05:03.260 --> 00:05:04.380 +I got a few podcast listeners. + +00:05:04.960 --> 00:05:06.800 +I can let them know about it. + +00:05:06.960 --> 00:05:07.940 +An audience helps. + +00:05:08.460 --> 00:05:11.500 +I honestly think it was probably the right choice for me. + +00:05:11.700 --> 00:05:17.300 +And for those who haven't come across it yet, do you want to give us the one paragraph version? + +00:05:17.460 --> 00:05:17.600 +Yeah. + +00:05:17.840 --> 00:05:20.220 +So the book is called Talk Python in Production. + +00:05:20.740 --> 00:05:21.800 +There are other books that are, + +00:05:22.100 --> 00:05:24.040 +I'm pretty sure one is called Python in Production + +00:05:24.320 --> 00:05:26.160 +or other things about how do you get your Python app + +00:05:26.390 --> 00:05:26.860 +into production. + +00:05:27.190 --> 00:05:29.000 +But this is Talk Python in Production + +00:05:29.210 --> 00:05:33.140 +because it's sort of a dual role storytelling vehicle. + +00:05:33.280 --> 00:05:35.320 +Obviously it's nonfiction, it's a technical book. + +00:05:35.620 --> 00:05:39.960 +But the idea was, let me tell not a generic story + +00:05:39.960 --> 00:05:42.660 +of how you might run some of your Python code in production, + +00:05:42.940 --> 00:05:45.340 +mostly APIs, web apps, databases, + +00:05:45.750 --> 00:05:46.680 +like that kind of stuff, right? 
+
+00:05:46.920 --> 00:05:54.760
+Not a general story of that, but my story, right? I, I've been on this journey of not complete noob,
+
+00:05:55.440 --> 00:06:03.400
+but pretty significantly lost, getting my original Python app out into the world, to pretty confident,
+
+00:06:03.620 --> 00:06:11.760
+running stuff in, I think, a simpler than usual way, in a good way, right? One of the things I really
+
+00:06:11.480 --> 00:06:18.440
+liked about the book is, it's not quite changing gears, but you, you do a nice mix of sort of the
+
+00:06:19.100 --> 00:06:24.980
+decision making process versus the here's-exactly-what-I-did. Um, and so you get a little bit of both,
+
+00:06:25.180 --> 00:06:30.620
+and, and honestly, the decision making process is something I find often isn't there in a lot of
+
+00:06:30.780 --> 00:06:36.100
+work. Uh, you know, your, your standard blog post is always, well, and then, and then add exactly this to
+
+00:06:36.120 --> 00:06:41.680
+exactly this file. Um, but I think I really, really sort of enjoyed the, and, and this is what I tried,
+
+00:06:41.800 --> 00:06:47.120
+and this is, this is why I changed. And you're very kind of humble about it. Like, a lot of
+
+00:06:47.420 --> 00:06:51.540
+folks who write this kind of content, it's thou shalt do this, and you're like, this is the way
+
+00:06:51.680 --> 00:06:57.540
+I've told you. Yeah, yeah, this worked for me. And, uh, yeah. So, so I really like the fact that you've,
+
+00:06:57.640 --> 00:07:03.599
+you've kind of blended that in. What, what made you decide to do this like that? So you said
+
+00:07:03.480 --> 00:07:08.160
+you didn't really have the itch to go down the path, so, uh, what? I'm not sure I didn't have the
+
+00:07:08.240 --> 00:07:13.880
+itch. I just didn't think that it was something I was capable of. Ah, I see, okay. You know what I mean?
+
+00:07:14.340 --> 00:07:20.040
+Not, not that I didn't think, if I literally took two years of my life and went into like a cabin,
+
+00:07:20.420 --> 00:07:25.300
+Thoreau style or something i could come out with a book i'm pretty sure but given all the
+
+00:07:25.500 --> 00:07:30.520
+constraints of like i have a family and i gotta keep Talk Python running like in that sense i didn't
+
+00:07:30.520 --> 00:07:31.960
+I didn't think I would be able to do it, but.
+
+00:07:32.040 --> 00:07:34.920
+- Yeah, it's perseverance more than anything else, I think.
+
+00:07:35.140 --> 00:07:36.220
+Yeah, yeah, for sure.
+
+00:07:36.940 --> 00:07:37.300
+- Exactly.
+
+00:07:38.070 --> 00:07:39.060
+So, yeah, go ahead.
+
+00:07:39.060 --> 00:07:39.480
+- Sorry, go ahead.
+
+00:07:39.700 --> 00:07:40.220
+No, no, go ahead.
+
+00:07:40.710 --> 00:07:42.880
+- You know, so why did I write it?
+
+00:07:43.780 --> 00:07:44.500
+Two reasons.
+
+00:07:44.740 --> 00:07:46.300
+One, I think it's an interesting story,
+
+00:07:46.780 --> 00:07:48.920
+and I thought people would enjoy hearing it,
+
+00:07:48.980 --> 00:07:50.800
+like the personal side that you mentioned a little bit.
+
+00:07:50.840 --> 00:07:52.160
+I thought people would appreciate that.
+
+00:07:52.500 --> 00:07:54.660
+And maybe more significantly,
+
+00:07:55.380 --> 00:07:58.179
+I feel like a lot of the advice out there
+
+00:07:58.200 --> 00:08:00.100
+in the tech space in general,
+
+00:08:00.740 --> 00:08:02.160
+but for now we're focused on like,
+
+00:08:02.170 --> 00:08:04.720
+how do I deploy my app sort of like Python plus DevOps
+
+00:08:05.120 --> 00:08:05.440
+type of thing.
+
+00:08:05.780 --> 00:08:07.640
+But I think a lot of the advice out there
+
+00:08:08.040 --> 00:08:11.320
+is a little bit of a flex in the sense that,
+
+00:08:12.120 --> 00:08:15.620
+oh, look at this, we're using these seven services
+
+00:08:15.940 --> 00:08:16.780
+and then this technology,
+
+00:08:17.040 --> 00:08:18.460
+and then we're auto scaling it with that.
+
+00:08:18.510 --> 00:08:20.880
+And then we have these logs pumped over to this other thing.
+ +00:08:21.100 --> 00:08:23.260 +You're like, whoa, okay, that's kind of cool. + +00:08:23.800 --> 00:08:26.379 +But a lot of people who just have an app + +00:08:26.400 --> 00:08:31.880 +they just want to go from like it works on my machine to look what I made they see that and go + +00:08:32.020 --> 00:08:38.039 +I can't do that you know what it's not for me it's just like I can't spend $500 on this + +00:08:38.460 --> 00:08:45.700 +infrastructure and I don't feel worthy if I don't have all you know like completely geo distributed + +00:08:46.200 --> 00:08:50.960 +redundant databases and like you don't need that you know what I mean and people keep asking me + +00:08:50.960 --> 00:08:55.839 +like hey Michael can you give me some advice I'm like well not that and finally I'm like let me + +00:08:55.800 --> 00:09:01.140 +just tell the story you know and so that was a big motivation you see it in industry a lot + +00:09:01.600 --> 00:09:05.880 +it's sometimes referred to as you know resume based architecture right like it's a do we need + +00:09:06.120 --> 00:09:12.780 +this or is this because i'm trying to learn it um and i think there's always that oh some of it's + +00:09:12.880 --> 00:09:19.180 +aspirational right i we will be netflix and so you know we need to be on every continent and all + +00:09:19.180 --> 00:09:23.859 +the rest of it and uh right right it's it's very aspirational it's like i'm going to build this app + +00:09:23.900 --> 00:09:26.460 +And the reason I'm building it is it's going to take off. + +00:09:26.460 --> 00:09:29.940 +And that day when the hockey stick hits, I'm ready. + +00:09:30.300 --> 00:09:30.480 +Yeah. + +00:09:30.700 --> 00:09:31.140 +You know what I mean? + +00:09:31.540 --> 00:09:39.900 +There's also a, you know, I think there's a lack of recognition of the cost of both in + +00:09:40.060 --> 00:09:42.740 +energy and money, energy as in human effort. 
+
+00:09:42.880 --> 00:09:47.020
+I'm not talking about electricity, of the next 1%, right?
+
+00:09:47.100 --> 00:09:51.820
+So like getting from 90% uptime to 95% uptime costs something.
+
+00:09:52.400 --> 00:09:58.660
+getting from 95 to 96 costs more than that and getting from 96 like and once you're getting into
+
+00:09:58.760 --> 00:10:03.780
+like the four nines five nines thing and then you know Cloudflare goes down and you're all screwed
+
+00:10:03.980 --> 00:10:11.760
+anyways right so it's so it's so ironic it is 100% ironic that you take all these steps and you employ
+
+00:10:11.790 --> 00:10:16.320
+all these services and it's the complexity of those services that went down like yeah you know
+
+00:10:16.320 --> 00:10:18.060
+this show will come out in a couple of weeks,
+
+00:10:18.280 --> 00:10:21.960
+but we're just on the eve of basically three weeks
+
+00:10:22.060 --> 00:10:23.720
+of important things going down.
+
+00:10:24.220 --> 00:10:27.740
+First AWS, then Azure, and then GitHub.
+
+00:10:28.520 --> 00:10:30.240
+And also, and then Cloudflare, so let's put that as four,
+
+00:10:30.500 --> 00:10:32.480
+within three weeks, right?
+
+00:10:32.720 --> 00:10:36.360
+And the AWS one was like, the reason it went down
+
+00:10:36.620 --> 00:10:39.680
+is DynamoDB had some sort of DNS problem.
+
+00:10:40.380 --> 00:10:42.700
+Even if you're not using that, the thing you're using,
+
+00:10:42.900 --> 00:10:46.680
+like Lambda depends upon DynamoDB for itself to work.
+
+00:10:46.680 --> 00:10:49.740
+So it was just like a cascade of kabang, right?
+
+00:10:49.940 --> 00:10:51.360
+And that's a little bit of this complexity.
+
+00:10:51.500 --> 00:10:53.380
+Like the more complexity you buy in,
+
+00:10:53.440 --> 00:10:55.960
+even if it's not yours, it is yours in a sense.
+
+00:10:56.080 --> 00:10:56.300
+Yeah, yeah.
+
+00:10:56.420 --> 00:10:58.260
+And there's always humans involved, right?
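As a rough aside on the uptime math here: each extra "nine" shrinks the allowed downtime budget by a factor of ten, which is one way to see why every additional percentage point costs more than the last. A quick illustrative sketch (not from the episode; assumes a plain 365-day year):

```python
# Back-of-the-envelope downtime budgets for various uptime targets.
# Illustrative only; assumes a 365-day year with no leap adjustment.

HOURS_PER_YEAR = 365 * 24  # 8760

def downtime_hours_per_year(availability: float) -> float:
    """Allowed downtime per year at a given availability fraction."""
    return (1.0 - availability) * HOURS_PER_YEAR

for target in (0.90, 0.95, 0.99, 0.999, 0.9999):
    hours = downtime_hours_per_year(target)
    print(f"{target:.2%} uptime -> {hours:8.2f} hours of downtime per year")
```

At 90% you can be down for weeks; at four nines you get well under an hour per year, which is the regime where a single upstream outage blows the whole budget.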
+ +00:10:58.420 --> 00:11:02.380 +So there's always fallibility somewhere, right? + +00:11:03.359 --> 00:11:05.820 +Although one of the arguments I have seen recently + +00:11:05.980 --> 00:11:07.620 +in response to the Cloudflare outages, + +00:11:08.220 --> 00:11:09.980 +the good news is if you're, you know, + +00:11:10.000 --> 00:11:11.240 +I saw some articles that were like, + +00:11:11.360 --> 00:11:13.120 +well, you shouldn't be dependent on Cloudflare. + +00:11:13.130 --> 00:11:14.780 +And I saw the counter articles were basically, + +00:11:15.400 --> 00:11:17.160 +you know what, when half the internet's down, + +00:11:17.310 --> 00:11:19.460 +no one's hassling you that your app is down + +00:11:19.670 --> 00:11:20.900 +because half the internet's down. + +00:11:21.010 --> 00:11:24.860 +So there is an excuse when it isn't your fault. + +00:11:25.080 --> 00:11:26.360 +So yeah, anyways. + +00:11:26.660 --> 00:11:27.200 +That is true. + +00:11:27.440 --> 00:11:31.280 +And you don't see what Cloudflare saved people. + +00:11:31.680 --> 00:11:31.780 +Yes. + +00:11:31.980 --> 00:11:32.160 +Right? + +00:11:32.600 --> 00:11:33.860 +I'm not using Cloudflare. + +00:11:34.010 --> 00:11:35.040 +I actually use bunny.net. 
+
+00:11:35.220 --> 00:11:38.680
+But CDNs make it possible for your app
+
+00:11:38.730 --> 00:11:40.219
+to survive these spikes in ways
+
+00:11:40.240 --> 00:11:45.000
+that they very well may not without and certainly the ddos type of stuff that they protect
+
+00:11:45.090 --> 00:11:52.640
+against well and i use it simply for certificates like google decided everyone shall be https even
+
+00:11:52.710 --> 00:11:58.860
+my sites that don't need it and rather than try to figure out automation for let's encrypt has
+
+00:11:58.910 --> 00:12:03.440
+gotten a lot better but when i first started it it was like and i need this and i need this and i
+
+00:12:03.540 --> 00:12:07.500
+need this and then the cron job could go down or it's like or i can stick Cloudflare in front of it
+
+00:12:07.500 --> 00:12:09.600
+and I never have to think about it ever again, right?
+
+00:12:09.920 --> 00:12:12.460
+So yeah, there's a little bit of value that way.
+
+00:12:12.940 --> 00:12:13.960
+Yeah, there definitely, definitely is.
+
+00:12:14.460 --> 00:12:16.780
+Another thing I want to kind of bring back a little bit
+
+00:12:16.900 --> 00:12:19.180
+is what you opened this segment on.
+
+00:12:19.420 --> 00:12:22.520
+You said, like I shared the human side of the story
+
+00:12:22.670 --> 00:12:24.020
+in kind of a humble way.
+
+00:12:24.590 --> 00:12:25.980
+Like that was certainly something,
+
+00:12:26.070 --> 00:12:27.500
+that was one of the main goals, like I said.
+
+00:12:27.780 --> 00:12:30.560
+I think it's just a continuation of the podcast, right?
+
+00:12:30.680 --> 00:12:32.640
+I started the podcast 10 years ago and I'm like,
+
+00:12:33.320 --> 00:12:35.840
+when I got into Python, there were no Python podcasts.
+
+00:12:36.040 --> 00:12:40.140
+there had been but there were none at the time and i'm like there's all these cool libraries i want
+
+00:12:40.140 --> 00:12:45.440
+to hear the stories and the humanity and you go to the documentation and you're like cool
+
+00:12:45.600 --> 00:12:52.160
+technology sterile as can be and the reason technology is interesting is not just because
+
+00:12:52.380 --> 00:12:57.960
+there's an api i can call this but it's like it's a journey and it's a story and it's so i just
+
+00:12:58.010 --> 00:13:02.360
+wanted to do that again in the book it's all problem solving right and and there has to
+
+00:13:02.200 --> 00:13:05.980
+have been a problem for someone to want to solve it, which means there's going to have been people
+
+00:13:06.220 --> 00:13:10.800
+involved in trying to figure out what that is. Yeah. I think a lot of people maybe, I don't know,
+
+00:13:10.940 --> 00:13:15.720
+I shouldn't speak for people. It seems to me though, like a lot of people, they look at a
+
+00:13:15.960 --> 00:13:22.660
+technology and they think, they just assess it as a dry, sterile sort of thing on its own. That was
+
+00:13:22.820 --> 00:13:30.560
+created in a context, right? Why was Celery created? Not just so I can send events to it and
+
+00:13:30.560 --> 00:13:35.000
+like, you know, add more complexity and asynchronous, it solved a real problem.
+
+00:13:35.160 --> 00:13:39.600
+And if you, if you hear and you understand and you, you follow that journey, you're like,
+
+00:13:39.740 --> 00:13:42.420
+I see, this is where this came from and why it exists.
+
+00:13:42.780 --> 00:13:44.380
+Then you can decide, is it for me?
+
+00:13:45.120 --> 00:13:45.180
+Right.
+
+00:13:45.620 --> 00:13:51.300
+Well, and, and I think doubly so in the, in the open source space, because like this is
+
+00:13:51.500 --> 00:13:52.440
+all volunteer work.
+
+00:13:52.880 --> 00:13:58.380
+And so knowing a little bit about who's doing what and, you know, humanizing that a little
+
+00:13:58.520 --> 00:13:58.680
+bit.
+
+00:13:59.560 --> 00:13:59.660
+Right.
+
+00:13:59.920 --> 00:14:00.380
+Their motivation.
+
+00:14:01.050 --> 00:14:01.180
+Yeah.
+
+00:14:01.390 --> 00:14:02.940
+And it's also easier to be grateful, right?
+
+00:14:03.120 --> 00:14:05.100
+Like this isn't some soulless corporate machine.
+
+00:14:05.290 --> 00:14:07.780
+There was a reason behind this and a driver behind it.
+
+00:14:09.420 --> 00:14:12.420
+This portion of Talk Python To Me is brought to you by Sentry.
+
+00:14:13.740 --> 00:14:14.600
+Let me ask you a question.
+
+00:14:15.400 --> 00:14:16.980
+What if you could see into the future?
+
+00:14:17.740 --> 00:14:19.000
+We're talking about Sentry, of course.
+
+00:14:19.320 --> 00:14:23.600
+So that means seeing potential errors, crashes, and bugs before they happen.
+
+00:14:24.040 --> 00:14:25.820
+Before you even accept them into your code base.
+
+00:14:26.480 --> 00:14:29.680
+That's what Sentry's AI Seer code review offers.
+
+00:14:30.400 --> 00:14:33.740
+You get error prediction based on real production history.
+
+00:14:34.270 --> 00:14:46.080
+AI Seer Code Review flags the most impactful errors your PR is likely to introduce before merge using your app's error and performance context, not just generic LLM pattern matching.
+
+00:14:46.920 --> 00:14:52.080
+Seer will then jump in on new PRs with feedback and warning if it finds any potential issues.
+
+00:14:52.780 --> 00:14:53.560
+Here's a real example.
+
+00:14:53.880 --> 00:15:01.580
+On a new PR related to a search feature in a web app, we see a comment from the Seer by Sentry bot in the PR.
+
+00:15:02.280 --> 00:15:09.920
+And it says, potential bug, the process search results function can enter an infinite recursion when a search query finds no matches.
+
+00:15:10.700 --> 00:15:14.900
+As the recursive call lacks a return statement and a proper termination condition.
+
+00:15:15.480 --> 00:15:23.100
+And Seer AI code review also provides additional details which you can expand for further information on the issue and suggested fixes.
+
+00:15:23.760 --> 00:15:28.880
+And bam, just like that, Seer AI Code Review has stopped a bug in its tracks without any
+
+00:15:29.080 --> 00:15:29.600
+devs in the loop.
+
+00:15:30.020 --> 00:15:32.820
+A nasty infinite loop bug never made it into production.
+
+00:15:33.540 --> 00:15:34.300
+Here's how you set it up.
+
+00:15:34.720 --> 00:15:40.060
+You enable the GitHub Sentry integration on your Sentry account, enable Seer AI on your
+
+00:15:40.200 --> 00:15:44.940
+Sentry account, and on GitHub, you install the Seer by Sentry app and connect it to your
+
+00:15:45.060 --> 00:15:46.700
+repositories that you want it to validate.
+
+00:15:47.060 --> 00:15:49.900
+So jump over to Sentry and set up Code Review for yourself.
+
+00:15:50.400 --> 00:15:54.520
+Just visit talkpython.fm/seer-code-review.
+
+00:15:54.730 --> 00:15:57.560
+The link is in your podcast player show notes and on the episode page.
+
+00:15:58.060 --> 00:16:00.360
+Thank you to Sentry for supporting Talk Python and me.
+
+00:16:02.779 --> 00:16:08.220
+Inside the book, you've added a couple of things that are a little sort of non-standard,
+
+00:16:08.480 --> 00:16:10.680
+like the audio reader briefs and the galleries.
+
+00:16:11.340 --> 00:16:12.980
+You want to give a quick rundown?
+
+00:16:13.440 --> 00:16:18.180
+And speaking of motivation, what motivated you to include those things?
+
+00:16:18.440 --> 00:16:20.100
+- So let me describe what they are first,
+
+00:16:20.260 --> 00:16:20.880
+'cause they are weird,
+
+00:16:21.440 --> 00:16:23.020
+but they're weird in a good way, I think.
+
+00:16:23.460 --> 00:16:25.000
+So if you go to the book,
+
+00:16:26.019 --> 00:16:28.620
+my vision was somebody's gonna be reading this,
+
+00:16:29.000 --> 00:16:30.260
+very likely on a Kindle,
+
+00:16:30.720 --> 00:16:35.960
+and if I go and put really nice diagrams, pictures, whatever,
+
+00:16:36.500 --> 00:16:39.580
+how good is that gonna look on a Kindle Paperwhite,
+
+00:16:39.760 --> 00:16:40.760
+black and white, you know what I mean?
+
+00:16:40.840 --> 00:16:42.440
+Like how hard is that going to be to read?
+
+00:16:42.520 --> 00:16:44.140
+I think it's gonna be hard is what I decided.
+
+00:16:44.540 --> 00:16:47.520
+And so what I ended up doing is I said, okay,
+
+00:16:48.300 --> 00:16:59.900
+how can I make it better for people so that when they want to work with code, it's not trapped inside your Kindle or your iPad, Apple Books or wherever you read it, but it's super accessible, right?
+
+00:17:00.220 --> 00:17:06.980
+So what I did is I created some things I called galleries, and there's a code gallery, a figure gallery, and a links gallery.
+
+00:17:07.400 --> 00:17:11.060
+And these are just like, they're kind of like an index of those things.
+
+00:17:11.079 --> 00:17:17.439
+So like the links one just says, hey, here's all the URLs that we talked about in chapter 10 or chapter 11.
+
+00:17:17.819 --> 00:17:19.480
+and just the sentence that contains them.
+
+00:17:19.720 --> 00:17:21.260
+So instead of trying to go back through
+
+00:17:21.500 --> 00:17:22.420
+and flipping through the book,
+
+00:17:22.480 --> 00:17:24.339
+like, where was that thing they talked about, right?
+
+00:17:24.459 --> 00:17:26.000
+Like, no, you just go to the gallery
+
+00:17:26.220 --> 00:17:27.060
+and you click on the chapter
+
+00:17:27.300 --> 00:17:28.620
+or you just do command F.
+
+00:17:29.040 --> 00:17:29.960
+There it is, you know what I mean?
+
+00:17:30.400 --> 00:17:32.500
+And also for, especially for the figures,
+
+00:17:32.760 --> 00:17:36.880
+like it has like 2,000 or 4,000
+
+00:17:37.120 --> 00:17:38.820
+by whatever level pictures
+
+00:17:39.540 --> 00:17:41.500
+that you're not even allowed to put into
+
+00:17:42.100 --> 00:17:42.800
+like a Kindle book.
+
+00:17:42.920 --> 00:17:44.620
+They're like, no, we're going to redo those,
+
+00:17:45.100 --> 00:17:46.299
+rescale those images for you
+
+00:17:46.280 --> 00:17:47.560
+down to something fuzzy, right?
+
+00:17:47.660 --> 00:17:50.720
+So if you want to read like little tiny texts, I put it there.
+
+00:17:51.100 --> 00:17:51.780
+So that's the galleries.
+
+00:17:51.920 --> 00:17:56.060
+And, maybe a little bit more backstory here is, when I wrote this,
+
+00:17:56.740 --> 00:17:59.680
+I've worked with other types of editing things and tools.
+
+00:17:59.880 --> 00:18:04.280
+I'm just like, I need to write this and I need to get this done in a super fluid
+
+00:18:04.460 --> 00:18:04.600
+way.
+
+00:18:05.080 --> 00:18:06.560
+So I'm just going to write in Markdown.
+
+00:18:07.060 --> 00:18:07.220
+Right.
+
+00:18:07.660 --> 00:18:08.520
+Just writing in Markdown.
+
+00:18:09.000 --> 00:18:13.420
+And so what I did is I, of course there's book publishing things that you can put
+
+00:18:13.540 --> 00:18:14.620
+Markdown into and so on.
+
+00:18:14.840 --> 00:18:22.720
+But I'm like, I'm just going to write one Markdown file per chapter and then write some Python program to assemble that in interesting ways.
+
+00:18:23.140 --> 00:18:23.260
+Right.
+
+00:18:23.500 --> 00:18:29.280
+And then turn that into an EPUB or PDF or Kindle or whatever through something called Pandoc.
+
+00:18:29.520 --> 00:18:30.320
+Are you familiar with Pandoc?
+
+00:18:30.680 --> 00:18:31.420
+I've heard of it.
+
+00:18:31.560 --> 00:18:31.640
+Yeah.
+
+00:18:31.940 --> 00:18:39.840
+If you go and look, for people who don't know what Pandoc is, if you go look at Pandoc, it has right on the web page, it has this like fuzzy thing on the right.
+
+00:18:40.300 --> 00:18:41.320
+It's like gray fuzzy.
+
+00:18:41.700 --> 00:18:42.640
+You know, what is that?
+
+00:18:43.100 --> 00:18:49.180
+this thing on the right shows you all the formats that go in and all the formats that could come out
+
+00:18:49.500 --> 00:18:54.960
+and it's it's insane like you can't even the line is just the lines connecting the graphs of these
+
+00:18:55.140 --> 00:19:02.060
+things it's just a black blob like i could put a Haddock uh a Haddock document whatever that is
+
+00:19:02.180 --> 00:19:07.700
+and convert that to a DocBook 4 right i mean it's insane okay so what i did is i built this
+
+00:19:07.660 --> 00:19:13.040
+Python, simple Python thing that reassembles Markdown in certain ways and then feed the
+
+00:19:13.240 --> 00:19:16.480
+final thing that it built into Pandoc to get the different ebook formats.
+
+00:19:16.800 --> 00:19:17.360
+Right, right.
+
+00:19:17.600 --> 00:19:17.720
+Okay.
+
+00:19:17.880 --> 00:19:22.940
+But then it occurred to me, like, so I didn't start out with these gallery type things or
+
+00:19:22.980 --> 00:19:26.540
+other stuff, but I'm like, well, this is just Python against Markdown.
+
+00:19:27.140 --> 00:19:30.860
+Surely I can start pulling out all the links and pulling out the images and then writing
+
+00:19:30.860 --> 00:19:32.960
+them somewhere else and then just committing that to GitHub.
+
+00:19:33.480 --> 00:19:37.820
+So once, you know, it's kind of just the standard story of Python or programming in general,
+
+00:19:38.040 --> 00:19:39.900
+but I think it's extra common in Python.
+
+00:19:40.100 --> 00:19:42.220
+It's like, I started solving the problem with Python.
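The kind of build script described here, one Markdown file per chapter, stitched together by Python, with a links gallery pulled out along the way, and the result handed to Pandoc, might look roughly like this. The file names, directory layout, and gallery format are illustrative guesses, not the book's actual build code; only the Pandoc invocation style (output format inferred from the extension) is standard Pandoc behavior:

```python
# Sketch of a chapter-assembly build: concatenate per-chapter Markdown,
# extract inline links into a "links gallery" chapter, then call Pandoc.
import re
import subprocess
from pathlib import Path

# Matches inline Markdown links: [title](https://example.com)
LINK_RE = re.compile(r"\[([^\]]+)\]\((https?://[^)]+)\)")

def build_book(chapter_dir: str, out_file: str = "book.epub") -> None:
    # Hypothetical layout: chapters named ch01.md, ch02.md, ...
    chapters = sorted(Path(chapter_dir).glob("ch*.md"))
    parts = []
    gallery = ["# Links Gallery"]
    for num, path in enumerate(chapters, start=1):
        text = path.read_text(encoding="utf-8")
        parts.append(text)
        links = LINK_RE.findall(text)
        if links:
            gallery.append(f"\n## Chapter {num}")
            gallery.extend(f"- [{title}]({url})" for title, url in links)
    combined = "\n\n".join(parts + ["\n".join(gallery)])
    Path("combined.md").write_text(combined, encoding="utf-8")
    # Pandoc picks the output format from the file extension
    # (.epub, .pdf, .docx, and many more).
    subprocess.run(["pandoc", "combined.md", "-o", out_file], check=True)
```

The same loop can collect image references for a figure gallery; once the chapters are plain Markdown and the glue is plain Python, each extra gallery is a few more lines, which is exactly the snowball described next.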
+
+00:19:42.380 --> 00:19:47.120
+And once that was in place, it's like the next block and the next thing is just like,
+
+00:19:47.320 --> 00:19:47.840
+that's easy now.
+
+00:19:47.900 --> 00:19:48.480
+And that's easy.
+
+00:19:48.540 --> 00:19:49.960
+And these three things are also easy.
+
+00:19:50.120 --> 00:19:51.940
+Let's just do that and just keep adding to it.
+
+00:19:52.120 --> 00:19:56.780
+So that's where they came from is one, wanting to make sure people had a good experience
+
+00:19:57.320 --> 00:20:00.220
+with like code resources, pictures, and so on.
+
+00:20:00.440 --> 00:20:04.300
+But also it's just kind of following the lead of like, hey, let's just keep going.
+
+00:20:04.620 --> 00:20:06.700
+Well, and it's one of the beauties of an ebook.
+
+00:20:08.260 --> 00:20:11.060
+With dead tree copies, those things cost money.
+
+00:20:11.440 --> 00:20:15.520
+And so it's like, oh, I've got a great idea for six more appendices.
+
+00:20:15.900 --> 00:20:17.900
+And that's when you start going, oh, wait a second.
+
+00:20:18.040 --> 00:20:20.520
+I'm not going to add 300 pages to a 200 page book.
+
+00:20:21.180 --> 00:20:21.720
+Yeah, exactly.
+
+00:20:22.240 --> 00:20:28.140
+With an ebook, you can go, oh, yeah, here, we can make this referenceable in a couple
+
+00:20:28.280 --> 00:20:28.660
+different ways.
+
+00:20:29.080 --> 00:20:29.200
+Right.
+
+00:20:29.280 --> 00:20:29.400
+Yeah.
+
+00:20:29.560 --> 00:20:35.060
+It's like it duplicates the images into, you know, maybe 20 more pages or something, but it's an ebook.
+
+00:20:35.160 --> 00:20:35.500
+Who cares?
+
+00:20:36.020 --> 00:20:36.140
+Exactly.
+
+00:20:36.700 --> 00:20:36.820
+Yeah.
+
+00:20:36.960 --> 00:20:37.100
+Yeah.
+
+00:20:37.540 --> 00:20:44.020
+So, you know, over the history of the show, I think I've become familiar with your AI journey.
+
+00:20:45.080 --> 00:20:48.180
+And recently, it sounds like you've bought in.
+
+00:20:48.300 --> 00:20:50.720
+You're a fairly big proponent.
+
+00:20:51.380 --> 00:20:54.740
+That being said, there's still a Made by Humans logo inside.
+
+00:20:55.580 --> 00:20:55.820
+Yeah.
+
+00:20:57.840 --> 00:21:00.980
+So I'm going to put you on the spot.
+
+00:21:01.800 --> 00:21:03.300
+Do you believe in it or not?
+
+00:21:04.100 --> 00:21:05.120
+Do I believe in it or not?
+
+00:21:06.760 --> 00:21:08.840
+So why made by humans?
+
+00:21:09.360 --> 00:21:10.560
+Yeah, it's a really good question.
+
+00:21:11.080 --> 00:21:26.180
+So I think there's a wariness of content generated by AI or assisted by AI meant to attract attention and build authoritative feelings about a thing.
+
+00:21:26.360 --> 00:21:30.580
+when that authority or that skill set is not necessarily yours.
+
+00:21:31.120 --> 00:21:34.080
+And that I still very much do not like.
+
+00:21:34.800 --> 00:21:37.880
+If I wanted to create a blog, I mean, I guess I could do it.
+
+00:21:37.880 --> 00:21:40.500
+It'd be a fun experiment, I guess, in sort of an evil way.
+
+00:21:40.530 --> 00:21:43.040
+Like, what if I just go create a completely random blog
+
+00:21:43.190 --> 00:21:47.620
+and I just have chat just bust out an article twice a day, every day,
+
+00:21:48.040 --> 00:21:50.280
+of the thing that's on the top of Hacker News or something?
+
+00:21:50.290 --> 00:21:52.200
+You know, just like, you could do that.
+
+00:21:52.490 --> 00:21:54.600
+And actually, it might even be interesting.
+
+00:21:54.710 --> 00:21:55.900
+I don't really even know.
+
+00:21:56.420 --> 00:22:01.000
+but I don't want it. I don't want that. Right. And for this, I wanted to share my story,
+
+00:22:01.180 --> 00:22:06.980
+not have AI create a book that has bullet points of my story. Right. Right. Yeah. So for me,
+
+00:22:07.000 --> 00:22:12.760
+it was important to like write this. I wrote it a hundred percent by Michael, right. It took me
People, I know like it got posted on like Reddit and I think Hacker News somewhere.
+
+00:22:19.400 --> 00:22:25.880
+There was a bunch of comments and they're like, Oh, this thing is definitely just AI generated.
+
+00:22:25.900 --> 00:22:32.500
+Felt not AI generated to me. If it makes you feel any better, I've had actually comments pop up on
+
+00:22:32.530 --> 00:22:39.400
+some of my video courses claiming that my voiceover was AI. So that's just the world we live in now.
+
+00:22:39.420 --> 00:22:43.820
+It's the world we live in and there's not a lot you can do with it. So just kind of put a little
+
+00:22:43.980 --> 00:22:49.960
+bit of a pushback against that. I did put like a prefix sort of thing and a label that says
+
+00:22:50.300 --> 00:22:55.160
+made by humans. And you know, what's really funny is I don't know if I can actually find that section.
+
+00:22:55.300 --> 00:22:59.400
+I don't think I can on just the web part, but I made a picture.
+
+00:22:59.900 --> 00:23:00.320
+Maybe I did.
+
+00:23:00.780 --> 00:23:00.980
+Humans?
+
+00:23:01.210 --> 00:23:01.320
+No.
+
+00:23:01.640 --> 00:23:03.620
+Anyway, I made a picture that I drew.
+
+00:23:03.710 --> 00:23:07.340
+I literally, I'm a big fan of Pixelmator Pro.
+
+00:23:07.450 --> 00:23:09.400
+I went into Pixelmator Pro and I drew it.
+
+00:23:09.720 --> 00:23:11.840
+And they said, proof that this is AI generated.
+
+00:23:12.080 --> 00:23:14.080
+Look at that stupid made by humans graphic.
+
+00:23:14.300 --> 00:23:15.240
+It's clearly AI generated.
+
+00:23:15.600 --> 00:23:17.420
+It would be way better if it wasn't generative.
+
+00:23:17.720 --> 00:23:17.900
+Yes.
+
+00:23:19.620 --> 00:23:19.780
+Okay.
+
+00:23:20.160 --> 00:23:24.340
+So how do I square that with me actually being quite a fan of AI stuff these days?
+
+00:23:24.620 --> 00:23:28.640
+Like I'm, let's do like a looking back and then looking forward.
+
+00:23:28.760 --> 00:23:30.020
+So let's go back 30 years.
+ +00:23:30.520 --> 00:23:38.500 +I'm also a fan of using editors that when I hit dot, tell me what I can do with that function, class, variable, et cetera. + +00:23:38.960 --> 00:23:42.140 +So I'm not constantly in the documentation, right? + +00:23:42.520 --> 00:23:44.300 +Does that make me not a programmer? + +00:23:44.840 --> 00:23:45.480 +I don't think so. + +00:23:45.560 --> 00:23:46.380 +I'm still writing code. + +00:23:46.420 --> 00:23:47.460 +I'm still thinking architecture. + +00:23:47.560 --> 00:23:49.860 +I'm just not in the documentation constantly. + +00:23:50.060 --> 00:23:56.780 +And honestly, I maybe don't memorize every little function and field of everything in the standard library, right? + +00:23:57.060 --> 00:23:57.400 +It's fine. + +00:23:57.540 --> 00:23:59.360 +That's not where our time is best spent. + +00:23:59.800 --> 00:24:01.780 +And I feel that way about AI programming. + +00:24:01.880 --> 00:24:05.940 +I think there's a lot of, there are pitfalls and there are worrisome aspects of it. + +00:24:06.400 --> 00:24:12.620 +But you can use some of these agentic AI tools these days to think in so much higher level building blocks. + +00:24:13.440 --> 00:24:16.800 +Think of like, I'm working in a function and I'm writing this part. + +00:24:16.940 --> 00:24:19.300 +or I'm working in design patterns. + +00:24:19.440 --> 00:24:21.600 +And I can think of these big sort of concepts. + +00:24:22.160 --> 00:24:23.760 +Well, with this AI stuff, you can just say, + +00:24:24.200 --> 00:24:25.360 +what if we had a login page? + +00:24:25.840 --> 00:24:26.840 +Oh, we have a login page. + +00:24:27.080 --> 00:24:29.220 +Now, what other building block do I need? + +00:24:29.320 --> 00:24:31.360 +Like the building blocks are so big + +00:24:31.820 --> 00:24:34.760 +and sort of non-critical software, + +00:24:35.180 --> 00:24:37.180 +non-super important software becomes + +00:24:38.140 --> 00:24:39.760 +so much cheaper than before. 
+ +00:24:40.360 --> 00:24:42.480 +You're like, I wish I had a little utility or app + +00:24:42.520 --> 00:24:43.220 +that would do this, + +00:24:43.420 --> 00:24:45.620 +but it just definitely doesn't justify + +00:24:45.640 --> 00:24:50.120 +two weeks of work to have it. Like, what if it was a couple of prompts and half an hour? Like, + +00:24:50.130 --> 00:24:54.340 +yeah, well then I'll have it. You know what I mean? And you can, that is transformative + +00:24:55.590 --> 00:24:59.680 +for how we work. Yeah. So much of coding is boilerplate, right? So if we can figure out how + +00:24:59.710 --> 00:25:05.640 +to make that easier, then why not? Right. And I haven't got there with it myself. I don't know + +00:25:05.790 --> 00:25:11.160 +whether I will. I'm definitely a little more suspicious of it than you are, but I copy and + +00:25:11.180 --> 00:25:12.360 +paste code all the time. + +00:25:12.550 --> 00:25:16.400 +And it's not like I'm like, oh, I have to hand tune that. + +00:25:16.510 --> 00:25:18.760 +No, it's like, well, I got to copy something that does that. + +00:25:18.840 --> 00:25:21.120 +- Yeah, let me give you a concrete example + +00:25:21.370 --> 00:25:24.320 +because I think it's easy to talk in generalizations + +00:25:24.450 --> 00:25:25.820 +and people are like, well, that's not for me, + +00:25:25.820 --> 00:25:27.520 +a bunch of AI slop, which is fair. + +00:25:28.040 --> 00:25:29.840 +But I'll give you an example of one thing I'm like, + +00:25:29.880 --> 00:25:33.280 +this was just such a nuisance and I'm gonna fix it. + +00:25:33.440 --> 00:25:37.080 +So when I first built Talk Python, the website 10 years ago, + +00:25:37.540 --> 00:25:40.000 +Bootstrap was all the rage, not modern Bootstrap, + +00:25:40.120 --> 00:25:41.500 +like old Bootstrap, right? 
+
+00:25:41.600 --> 00:25:43.560
+Which they've completely redone the way
+
+00:25:43.760 --> 00:25:46.460
+that you have to structure your HTML and CSS
+
+00:25:48.260 --> 00:25:51.060
+completely, incompatibly, several times since then.
+
+00:25:51.420 --> 00:25:53.500
+And until very recently, until this summer,
+
+00:25:54.060 --> 00:25:56.040
+every time I wanted to add a feature or an aspect,
+
+00:25:56.320 --> 00:25:58.540
+like for example, this whole section that hosts the book,
+
+00:25:59.180 --> 00:26:01.080
+I wanna add that, well, you know what I had to do?
+
+00:26:01.100 --> 00:26:03.200
+I had to go write like 10-year-old Bootstrap.
+
+00:26:03.320 --> 00:26:04.620
+And I'm like, I hate it so much.
+
+00:26:04.920 --> 00:26:07.140
+There's so much nicer tools I could be doing this with.
+
+00:26:07.740 --> 00:26:11.920
+but there's 8,000 lines of HTML and almost as many CSS.
+
+00:26:12.780 --> 00:26:15.940
+Does it justify me rewriting that much UI
+
+00:26:16.720 --> 00:26:18.660
+so that I can feel a little bit better
+
+00:26:18.700 --> 00:26:20.340
+and use a little bit nicer UI tooling?
+
+00:26:20.880 --> 00:26:22.440
+No, it definitely doesn't.
+
+00:26:22.960 --> 00:26:24.960
+And so for a couple of years, I'm like,
+
+00:26:25.080 --> 00:26:27.000
+oh, I wish I could do something else,
+
+00:26:27.220 --> 00:26:28.140
+but it's not worth it.
+
+00:26:28.360 --> 00:26:29.700
+And then I was sitting on my porch,
+
+00:26:30.100 --> 00:26:31.780
+little back area with my computer in the summer,
+
+00:26:31.980 --> 00:26:32.780
+hanging out, I'm like, you know what?
+
+00:26:32.780 --> 00:26:33.800
+I bet Claude could do it.
+
+00:26:34.080 --> 00:26:36.740
+Hey, Claude, rewrite all of this site.
+
+00:26:37.140 --> 00:26:43.140
+make a plan and rewrite it, move it from Bootstrap 3 to Bulma, which is like simple Tailwind,
+
+00:26:43.580 --> 00:26:48.640
+and just completely do it.
Four hours later, the whole thing was redone, like 20,000 lines of edits. + +00:26:49.100 --> 00:26:49.320 +Wow. + +00:26:49.770 --> 00:26:53.460 +Done. And it wasn't perfect. I had to go, you messed up a little bit here. And actually, + +00:26:53.980 --> 00:26:57.100 +that was right, but that doesn't look good anymore. So could you actually just make it look, + +00:26:57.170 --> 00:27:00.440 +you know what I mean? But I mean, it was like half a day. That work was done. + +00:27:00.710 --> 00:27:00.800 +Right. + +00:27:00.960 --> 00:27:03.080 +And that is a different level. + +00:27:04.160 --> 00:27:06.940 +No, it would have been weeks to a month. + +00:27:07.000 --> 00:27:08.620 +And it's the worst kind. + +00:27:08.700 --> 00:27:11.220 +It's like, okay, here's how the grid system used to work. + +00:27:11.340 --> 00:27:12.680 +Let me restructure the HTML. + +00:27:12.860 --> 00:27:13.800 +Oh, you lost a div? + +00:27:14.280 --> 00:27:14.600 +Whoopsie. + +00:27:15.020 --> 00:27:16.440 +Now how are you going to untangle this? + +00:27:16.540 --> 00:27:16.840 +You know what I mean? + +00:27:16.900 --> 00:27:18.860 +Like really, really not good stuff. + +00:27:19.040 --> 00:27:20.600 +And you can just turn these tools on it. + +00:27:20.600 --> 00:27:22.500 +And I'm like, you know, love it or hate it. + +00:27:22.600 --> 00:27:27.320 +That is a skill and a power and a tool that is unlike things we've had before. + +00:27:27.480 --> 00:27:31.160 +And so when I started having some of those kinds of experiences, I'm like, all right, + +00:27:31.200 --> 00:27:32.240 +I need to pay attention to this. + +00:27:32.680 --> 00:27:35.340 +I honestly think a lot of these AIs and LLMs, + +00:27:35.420 --> 00:27:38.980 +they're kind of copyright theft and other types of things. + +00:27:39.440 --> 00:27:41.500 +And there's the environmental aspect and all that. + +00:27:41.560 --> 00:27:43.620 +But the thing is out of the box. 
+ +00:27:44.160 --> 00:27:45.740 +Pandora is on the loose, right? + +00:27:46.980 --> 00:27:49.420 +So given that you're putting your head in the sand, + +00:27:49.580 --> 00:27:50.740 +it's not going to make it go away. + +00:27:51.200 --> 00:27:51.920 +Should you use it or not? + +00:27:52.340 --> 00:27:53.500 +It's a very powerful tool. + +00:27:53.620 --> 00:27:55.380 +And so that is what I'm excited about, + +00:27:55.780 --> 00:27:58.120 +but I'm not excited about when I go to YouTube + +00:27:58.320 --> 00:28:00.720 +and I see a video and you can just tell + +00:28:00.720 --> 00:28:02.200 +that it's a voiceover plus some general, + +00:28:02.300 --> 00:28:04.540 +or I go to read a blog and you can tell that it's like, + +00:28:05.220 --> 00:28:07.600 +they didn't even put enough energy into like, + +00:28:08.080 --> 00:28:10.260 +they spent less time writing than I have to read it. + +00:28:10.360 --> 00:28:10.860 +That's not right. + +00:28:10.980 --> 00:28:11.900 +There's something going wrong here. + +00:28:12.020 --> 00:28:13.380 +- Well, as with all tools, + +00:28:13.500 --> 00:28:15.380 +we'll figure out what works and what doesn't work. + +00:28:16.260 --> 00:28:19.200 +Those 8,000 files that you're talking about though, + +00:28:19.360 --> 00:28:23.760 +those are 8,000 files you have and built over time. + +00:28:25.260 --> 00:28:27.520 +I suspect- - Lines, by the way, not files. + +00:28:27.700 --> 00:28:28.720 +- Lines, I'm sorry. + +00:28:28.720 --> 00:28:31.020 +- I think it's a couple hundred files probably. + +00:28:31.600 --> 00:28:36.700 +so, but, so that might be something that folks listening aren't really aware of, + +00:28:36.840 --> 00:28:36.940 +right? + +00:28:37.020 --> 00:28:42.400 +Like, you're not just the, you know, the podcasts and the courses, but you're + +00:28:42.640 --> 00:28:44.640 +the guy behind the engineering behind all of it. + +00:28:44.860 --> 00:28:46.940 +So why, you know, why do that? 
+ +00:28:47.040 --> 00:28:51.340 +Why not, you know, Squarespace or something along those lines didn't exist when you came + +00:28:51.390 --> 00:28:52.260 +out, but you get the idea. + +00:28:52.400 --> 00:28:55.100 +Like what, what, what, why spin it up yourself? + +00:28:55.380 --> 00:28:56.740 +How did you, how did you get there? + +00:28:56.980 --> 00:28:57.940 +So it's a good question. + +00:28:58.820 --> 00:29:04.600 +When I started, there were places I could have gone and hosted the podcast. + +00:29:05.240 --> 00:29:08.040 +You know, they were very unappealing. + +00:29:08.660 --> 00:29:12.000 +Not in the sense, like, as a consumer of it, like, they put your show there. + +00:29:13.240 --> 00:29:14.740 +They were really ugly. + +00:29:15.160 --> 00:29:18.720 +And they would do things like, next to your show, here's some other shows you might like. + +00:29:18.840 --> 00:29:19.400 +You're like, no. + +00:29:20.280 --> 00:29:20.900 +No, I don't. + +00:29:21.240 --> 00:29:22.960 +I just got people to come to my page. + +00:29:23.020 --> 00:29:23.760 +You're sending them away. + +00:29:24.020 --> 00:29:24.740 +Like, don't do that. + +00:29:25.120 --> 00:29:25.240 +Right. + +00:29:25.340 --> 00:29:29.900 +But those sites are like Squarespace or whatever, and they're hosting a bunch of them. + +00:29:29.940 --> 00:29:32.660 +And so they want engagement on their platform broadly. + +00:29:33.220 --> 00:29:34.260 +They're not for you. + +00:29:34.600 --> 00:29:38.800 +So initially I thought, well, plus I don't have a ton of experience writing Python code. + +00:29:38.880 --> 00:29:42.620 +And if I'm going to do the podcast, the more I can get my hands on this stuff, get experience. + +00:29:43.200 --> 00:29:47.800 +So I just sat down and really in like three days, I wrote the Talk Python website. + +00:29:48.480 --> 00:29:49.920 +I'm like, I'm doing it this weekend. + +00:29:50.160 --> 00:29:50.480 +You know what I mean? 
+
+00:29:50.600 --> 00:29:51.380
+I had a job at the time.
+
+00:29:51.500 --> 00:29:52.260
+So I'm like, I got to do it.
+
+00:29:52.300 --> 00:29:52.900
+It's a long weekend.
+
+00:29:53.140 --> 00:29:53.500
+We're doing it.
+
+00:29:53.800 --> 00:30:07.800
+And so I just sat there and cranked it out and really got a really good appreciation for building all the way to the end, you know, like not 60% or following a demo, but like, no, here's a fully working app and all the little nuances.
+
+00:30:08.060 --> 00:30:15.580
+But then honestly, that's like the genesis of this, the story that is the book is, well, now how do I get it out there?
+
+00:30:15.940 --> 00:30:16.580
+I built it.
+
+00:30:16.620 --> 00:30:17.020
+It's fine.
+
+00:30:17.020 --> 00:30:18.380
+It works great here locally.
+
+00:30:19.260 --> 00:30:21.220
+Now, like, where do I take it?
+
+00:30:21.240 --> 00:30:21.380
+Right.
+
+00:30:21.520 --> 00:30:24.900
+And a lot of places said, well, you just fire up your Linux terminal and your SSH.
+
+00:30:24.900 --> 00:30:26.680
+And I'm like, these words are making me very nervous.
+
+00:30:27.320 --> 00:30:28.360
+I need them to not do that.
+
+00:30:28.400 --> 00:30:29.900
+I need you to stop saying that.
+
+00:30:31.660 --> 00:30:34.420
+Don't forget to swing the chicken over your head.
+
+00:30:35.560 --> 00:30:35.920
+Exactly.
+
+00:30:37.140 --> 00:30:44.480
+So I actually started on PythonAnywhere, even before Anaconda owned them, and the
+
+00:30:45.040 --> 00:30:47.000
+selling point was you go to your browser.
+
+00:30:47.660 --> 00:30:53.920
+You, I think you give it a git URL, or maybe, maybe you go into a terminal and do a git pull.
+
+00:30:53.970 --> 00:30:57.160
+I can't remember how it worked 10 years ago, but it was basically you go to the webpage,
+
+00:30:57.580 --> 00:31:02.060
+you type in your domain, you get a terminal, which is basically an SSH connection, but
+
+00:31:02.230 --> 00:31:05.480
+in the terminal, and then you give it some commands and then they manage it for you.
+
+00:31:05.880 --> 00:31:07.740
+And I'm like, okay, I don't really have to know any Linux.
+
+00:31:08.070 --> 00:31:10.080
+I just have to do the two things that it says in the terminal.
+
+00:31:10.640 --> 00:31:16.860
+And then they keep it running and they, they do the SSH keys, the SSL certificates, the DNS,
+
+00:31:17.220 --> 00:31:21.260
+that I'm like, this I can do. And I got it going there, and I was really proud. And I ran,
+
+00:31:21.640 --> 00:31:27.520
+I ran Talk Python on basically PythonAnywhere and SQLite for like six months, the first six months.
+
+00:31:27.860 --> 00:31:33.760
+But then it occurred to me that PythonAnywhere is not really intended to host production-level
+
+00:31:33.980 --> 00:31:39.280
+applications. And it occurred to me when I got an email from them one day, again, this is pre-Anaconda,
+
+00:31:39.660 --> 00:31:46.280
+and what it said was, we're going to be down for four hours as we do some maintenance. That's not
+
+00:31:46.180 --> 00:31:50.840
+going to be the best look for my podcast, which is just now starting to gain some traction and getting
+
+00:31:50.840 --> 00:31:54.480
+a lot of people talking on social media and saying, hey, there's a new podcast, you should check
+
+00:31:54.620 --> 00:31:59.480
+it out. I'm like, the four hours are not making me psyched. Like, I understand that things might have
+
+00:31:59.480 --> 00:32:05.980
+to reboot, they might be down for 30 seconds, but hours and hours seems a little like, this is not
+
+00:32:06.100 --> 00:32:10.000
+really what they intend this for. It's like for hobbyists to put up a thing, and I probably don't
+
+00:32:10.080 --> 
00:32:16.140
+belong there anymore. And once you shifted, why not AWS or Azure or something like that? So,
+
+00:32:16.160 --> 00:32:19.240
+So I looked around and I went to DigitalOcean.
+
+00:32:19.940 --> 00:32:23.180
+So I'd done stuff with both Azure and AWS.
+
+00:32:23.540 --> 00:32:25.840
+A little part that I left out about this web journey
+
+00:32:25.840 --> 00:32:29.460
+is I had actually run some pretty big websites
+
+00:32:30.220 --> 00:32:31.740
+in both AWS and Azure,
+
+00:32:32.260 --> 00:32:36.140
+but they were Windows-based .NET type things, right?
+
+00:32:36.360 --> 00:32:40.300
+So literally GUI sort of configuration hosting them
+
+00:32:40.780 --> 00:32:42.140
+or platform as a service.
+
+00:32:42.840 --> 00:32:45.420
+And I don't know, I looked at both of them,
+
+00:32:45.540 --> 00:32:46.340
+especially Azure at the time,
+
+00:32:46.340 --> 00:32:47.860
+I'm like, "Well, this is complicated,"
+
+00:32:47.860 --> 00:32:49.160
+and like unnecessarily so,
+
+00:32:49.250 --> 00:32:52.520
+and I'm afraid I'm trading one level of complexity
+
+00:32:52.650 --> 00:32:54.840
+for another, and also really expensive,
+
+00:32:55.260 --> 00:32:56.680
+like no joke expensive.
+
+00:32:57.240 --> 00:33:00.320
+The podcast, like in terms of people viewing the pages,
+
+00:33:00.760 --> 00:33:03.440
+is nothing insane, I mean, it's certainly popular,
+
+00:33:03.620 --> 00:33:05.400
+but it's nothing like, how are you gonna handle that?
+
+00:33:05.500 --> 00:33:07.800
+But the amount of traffic the podcasts and courses do
+
+00:33:07.810 --> 00:33:11.700
+in terms of video and MP3s and even XML,
+
+00:33:12.140 --> 00:33:14.900
+like I think Talk Python ships about a gigabyte of,
+
+00:33:15.410 --> 00:33:16.920
+no, a terabyte of XML.
+
+00:33:17.320 --> 00:33:18.740
+Think about a terabyte of XML every month.
+
+00:33:18.750 --> 00:33:22.180
+Like it's basically a distributed denial, a DDoS,
+
+00:33:22.720 --> 00:33:24.460
+and a welcome DDoS attack,
+
+00:33:25.080 --> 00:33:27.620
+because you think about how many podcast players are out there
+
+00:33:27.690 --> 00:33:28.880
+going, got a new one, got a new one,
+
+00:33:28.920 --> 00:33:30.000
+got a new one, got a new one.
+
+00:33:30.400 --> 00:33:35.140
+And each one of those requests is like a meg of XML or more.
+
+00:33:35.380 --> 00:33:36.260
+You know, it's like, got a new one,
+
+00:33:36.340 --> 00:33:42.100
+got a new one.
+
+00:33:42.340 --> 00:33:47.340
+Certainly the courses. And my bill was well over a thousand bucks in just bandwidth, right? And then
+
+00:33:47.340 --> 00:33:54.720
+I looked at DigitalOcean and I'm like, oh, you mean all that bandwidth is free? It's included? You
+
+00:33:54.880 --> 00:34:00.980
+get like terabytes of free traffic with DigitalOcean or Hetzner or some of these smaller ones, and I'm
+
+00:34:01.020 --> 00:34:05.780
+just like, yeah, this is better. Like, I don't know what, I don't know what this stuff here is for,
+
+00:34:05.780 --> 00:34:10.639
+where they charge, you know, a hundred dollars a terabyte, yeah, just to ship stuff around. It's crazy.
+
+00:34:10.659 --> 00:34:16.480
+I have found, particularly professionally, it's like, oh, you're going to charge me for how many DNS lookups there are.
+
+00:34:16.879 --> 00:34:19.100
+That doesn't seem like something I can predict.
+
+00:34:20.580 --> 00:34:21.679
+Yes, I know.
+
+00:34:22.480 --> 00:34:28.379
+And quite frankly, if it reaches a certain level, I need you to turn it off because something's gone horribly awry.
+
+00:34:31.179 --> 00:34:33.679
+This portion of Talk Python To Me is brought to you by Agency.
+
+00:34:34.210 --> 00:34:40.340
+The Agency, spelled A-G-N-T-C-Y, is an open source collective building the Internet of Agents.
+
+00:34:40.379 --> 00:34:46.260
+We're all very familiar with AI and LLMs these days, but if you have not yet experienced the
+
+00:34:46.510 --> 00:34:53.040
+massive leap that agentic AI brings, you're in for a treat. Agentic AI takes LLMs from the world's
+
+00:34:53.220 --> 00:34:59.000
+smartest search engines to truly collaborative software. That's where Agency comes in. Agency is
+
+00:34:59.030 --> 00:35:04.380
+a collaboration layer where AI agents can discover, connect, and work across frameworks.
+
+00:35:05.120 --> 00:35:10.960
+For developers, this means standardized agent discovery tools, seamless protocols for interagent
+
+00:35:11.180 --> 00:35:15.960
+communication, and modular components to compose and scale multi-agent workflows.
+
+00:35:16.780 --> 00:35:21.380
+Agency allows AI agents to discover each other and work together regardless of how they're
+
+00:35:21.600 --> 00:35:23.540
+built, who built them, or where they run.
+
+00:35:24.460 --> 00:35:28.760
+And they just announced several key updates, including interoperability with Anthropic's
+
+00:35:29.060 --> 00:35:35.780
+Model Context Protocol, MCP, a new observability data schema enriched with concepts specific to
+
+00:35:35.850 --> 00:35:42.160
+multi-agent systems, as well as new extensions to the Open Agentic Schema Framework, OASF.
+
+00:35:43.060 --> 00:35:47.320
+So are you ready to build the internet of agents? Get started with Agency and join
+
+00:35:47.700 --> 00:35:53.999
+CrewAI, LangChain, LlamaIndex, BrowserBase, Cisco, and dozens more. Visit talkpython.fm
+
+00:35:54.040 --> 00:35:55.880
+slash agency to get started today.
+
+00:35:56.300 --> 00:35:58.280
+That's talkpython.fm/agency.
+
+00:35:58.820 --> 00:36:00.720
+The link is in your podcast player's show notes
+
+00:36:01.040 --> 00:36:02.220
+and on the episode page.
+
+00:36:02.800 --> 00:36:05.240
+Thank you to the Agency for supporting Talk Python and me.
+
+00:36:06.660 --> 00:36:09.100
+There's certainly areas where the cloud can go like sideways.
+
+00:36:10.160 --> 00:36:13.380
+In the book, I mentioned a story about Cara, I believe.
+
+00:36:13.940 --> 00:36:17.220
+And that was this project that this woman,
+
+00:36:17.270 --> 00:36:19.380
+I think in Hong Kong or South Korea,
+
+00:36:19.780 --> 00:36:21.120
+I'm afraid I can't remember which,
+
+00:36:21.560 --> 00:36:22.220
+I think it's South Korea.
+
+00:36:22.380 --> 00:36:23.280
+Anyway, created this.
+
+00:36:23.500 --> 00:36:28.640
+She's a photographer and really hates AI-generated art, so created this service that would say, hey,
+
+00:36:29.010 --> 00:36:35.120
+give me a piece of art and I'll tell you if it's AI-generated or not, or something vaguely like this.
+
+00:36:35.460 --> 00:36:40.900
+And her thing took off in the App Store and was like number six, and her cloud bill at Vercel was
+
+00:36:41.580 --> 00:36:46.860
+ninety-six thousand dollars in a week. Right? Not as a business, just as a human who built something
+
+00:36:47.060 --> 00:36:52.440
+fun as a side project. Like, oh my. Yeah, and in fairness to a lot of those tools, they tend to
+
+00:36:52.380 --> 00:36:54.760
+have ways of saying, please limit this and do that.
+
+00:36:55.020 --> 00:36:55.200
+Yeah.
+
+00:36:55.900 --> 00:36:57.080
+They're not the default.
+
+00:36:57.550 --> 00:37:03.800
+And if you're not thinking about that problem and protecting yourself, you know, you'd, you'd
+
+00:37:03.940 --> 00:37:05.800
+kind of hope that it would be the other way around.
+
+00:37:06.060 --> 00:37:08.700
+It should be, yeah, you know, here's your cap.
+
+00:37:08.880 --> 00:37:11.240
+And if you want more than that, you need to do something about it.
+
+00:37:11.700 --> 00:37:12.520
+Yeah, absolutely.
+
+00:37:12.790 --> 00:37:17.580
+And in fairness to them, they did send her a message saying your bill is going way higher
+
+00:37:17.630 --> 00:37:18.480
+than you might expect.
+ +00:37:18.940 --> 00:37:22.020 +And she didn't look at her email or something, but still. + +00:37:22.220 --> 00:37:27.540 +And so one of the things that really appeals to me is when you choose something like a + +00:37:27.700 --> 00:37:30.960 +Hetzner or a DigitalOcean or something, and you say, I'm going to pay for the server. + +00:37:31.220 --> 00:37:34.320 +You're like, okay, that's $40 I'm committing to. + +00:37:34.600 --> 00:37:36.500 +Maybe double that, you know, whatever. + +00:37:37.000 --> 00:37:40.020 +But the bandwidth is basically free or it's included, right? + +00:37:40.200 --> 00:37:41.500 +But for 40 bucks, it feels free. + +00:37:42.020 --> 00:37:44.800 +And then it's only going to cost as much as it costs. + +00:37:44.820 --> 00:37:47.520 +You might have to go, oh my gosh, it's too much traffic. + +00:37:47.620 --> 00:37:48.560 +We're going to have to deal with it. + +00:37:48.680 --> 00:37:52.800 +but it's the upper bound of those systems these days is so high. + +00:37:53.210 --> 00:37:59.700 +It is so high that we, you know, back to that aspirational thing that you mentioned, right? + +00:38:00.040 --> 00:38:04.600 +Like the chances that you blow past what a $50 server can handle, + +00:38:05.160 --> 00:38:10.140 +you're going to be really, really popular with the SaaS or something. + +00:38:10.420 --> 00:38:13.300 +You were joking earlier about SQLite. + +00:38:13.520 --> 00:38:14.900 +It's gotten so much better as well. + +00:38:14.930 --> 00:38:17.020 +And I'm not saying it's the answer to everything, + +00:38:17.780 --> 00:38:21.660 +But I could probably come pretty close to running your site now too. + +00:38:21.860 --> 00:38:27.260 +Like it's scary between the processor improvements and the improvements in the software. + +00:38:27.520 --> 00:38:30.660 +It's made a big difference for that kind of stuff. + +00:38:30.700 --> 00:38:36.360 +It takes very, very little hardware to handle something that's pretty, pretty impressive. 
+
+00:38:36.740 --> 00:38:36.940
+Yeah.
+
+00:38:38.160 --> 00:38:41.240
+So I think the title of chapter four is one of my favorites.
+
+00:38:41.860 --> 00:38:45.180
+It gives a little hint as to what approach you took.
+
+00:38:45.400 --> 00:38:47.400
+The title is Docker, Docker, Docker.
+
+00:38:48.020 --> 00:38:53.260
+So what approach did you take? You know what, I was really not wanting to do Docker. Genuinely, I mean
+
+00:38:53.400 --> 00:38:59.680
+that. And so what I did, when I originally switched over to some VMs, I'm like, okay, the story I'm told
+
+00:38:59.680 --> 00:39:05.360
+of what the cloud is, you know, I bought the AWS, the EC2 story. Well, we've got all this extra capacity.
+
+00:39:05.860 --> 00:39:11.620
+Oh, instead of getting like really expensive heavy metal, you know, big metal sort of servers, you get
+
+00:39:11.620 --> 00:39:15.719
+a bunch of small ones and they're kind of like cheap and you just make them, throw them away,
+
+00:39:15.740 --> 00:39:20.000
+whatever, right? So I went and made a bunch of small servers in, in DigitalOcean. I
+
+00:39:20.060 --> 00:39:22.760
+think I had eight servers at one point, and I thought this is gonna give me lots
+
+00:39:22.760 --> 00:39:25.940
+of isolation. If I got to work on this one thing, it won't mess with that. And
+
+00:39:25.940 --> 00:39:30.560
+what I realized is they're interconnected enough that, like, really I
+
+00:39:30.640 --> 00:39:34.260
+end up just having to reboot eight servers in an orchestrated way rather than
+
+00:39:34.300 --> 00:39:38.180
+managing one. I'm like, this is just worse. I gotta patch eight servers instead of one
+
+00:39:38.260 --> 00:39:42.599
+now, because... this is not better. So how do I end up with Docker, Docker, Docker? I,
+
+00:39:42.680 --> 00:39:48.860
+I realized that it would be better to just have one server. And basically, stepping back just a little bit.
+ +00:39:48.870 --> 00:40:04.120 +Like, what if you could completely isolate yourself from almost all the complexity of the cloud and all of their services and all that junk and just say, I have a place where I can run apps that's got enough capacity that I can just keep throwing more apps in there if I want. + +00:40:04.560 --> 00:40:10.520 +And it doesn't have any tie in with the specific APIs and services of a particular provider. + +00:40:10.960 --> 00:40:14.940 +So I said, well, what if I just get a big server and I just run all my apps in there? + +00:40:15.400 --> 00:40:17.320 +And if I want a database, I put the database there. + +00:40:17.320 --> 00:40:21.680 +If I want like a Valkey caching, I can put a Valkey caching and things like that. + +00:40:22.000 --> 00:40:27.300 +And that's sort of as much autonomy as I can exert on running something in the cloud. + +00:40:27.580 --> 00:40:30.740 +It's almost like I went and got a big machine and stuck it in my closet. + +00:40:31.120 --> 00:40:36.060 +But that's insane because you get million dollar networking equipment and, you know, failover. + +00:40:36.560 --> 00:40:41.620 +But that doesn't mean you have to go fully pass managed database, this other service. + +00:40:41.810 --> 00:40:46.680 +Like you could just say, just give me a Linux machine where I can then go do my, do my, + +00:40:46.960 --> 00:40:51.920 +my hosting and all my apps and let them party and talk to each other and stuff in there. + +00:40:52.020 --> 00:40:52.180 +Right. + +00:40:52.500 --> 00:40:56.600 +So then I thought, well, I don't have all these little servers for isolation. + +00:40:57.110 --> 00:41:02.240 +I'm not really sure I want to throw all this random stuff together, like completely just + +00:41:02.740 --> 00:41:04.940 +in the same soup in that one big server. 
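The "one big server" layout described here, with the apps, a database, and a Valkey cache living as sibling containers on a single box, can be sketched with a Compose file roughly like the one below. The service names, images, paths, and port are illustrative assumptions, not the actual Talk Python stack:

```shell
# Hypothetical docker-compose.yml for a "one big server" setup:
# each app, the database, and the cache are sibling containers,
# isolated from each other but sharing the same machine.
cat > docker-compose.yml <<'EOF'
services:
  web:
    image: my-python-app:latest   # placeholder app image
    restart: unless-stopped
    ports:
      - "8000:8000"               # only the web app is exposed
    depends_on: [db, cache]
  db:
    image: postgres:16
    restart: unless-stopped
    volumes:
      - ./data/pg:/var/lib/postgresql/data  # data stays on the host
  cache:
    image: valkey/valkey
    restart: unless-stopped
EOF

# Containers reach each other by service name (db, cache) on the
# private Compose network; nothing else is published to the internet.
cat docker-compose.yml
```

With a file like this, adding another app to the box is just another service entry, which is the "keep throwing more apps in there" property being described.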
+
+00:41:05.420 --> 00:41:07.540
+And by the way, the big server right now that it's running on
+
+00:41:07.640 --> 00:41:10.600
+has eight cores, 16 gigs of RAM, and costs $30.
+
+00:41:11.020 --> 00:41:11.240
+Right.
+
+00:41:11.620 --> 00:41:13.760
+It comes with two terabytes or four terabytes of traffic,
+
+00:41:13.890 --> 00:41:14.300
+something like that.
+
+00:41:14.640 --> 00:41:14.780
+Lots.
+
+00:41:15.180 --> 00:41:17.580
+$400 of included bandwidth for $30.
+
+00:41:18.100 --> 00:41:19.480
+So I said, well, what if I took that over?
+
+00:41:19.940 --> 00:41:24.200
+I think autonomy is a big motivator of this whole journey as well.
+
+00:41:24.360 --> 00:41:27.180
+Like, I don't want to be tied into all these different things.
+
+00:41:27.250 --> 00:41:30.180
+I just want a space where I have reliable compute and networking
+
+00:41:30.440 --> 00:41:32.000
+and Linux, and I can just do whatever.
+
+00:41:32.220 --> 00:41:33.660
+So then I said, all right, well,
+
+00:41:33.940 --> 00:41:37.780
+I better figure out some of the stuff with Docker just so that there is some
+
+00:41:38.100 --> 00:41:40.920
+isolation of all the different pieces living in the same space.
+
+00:41:41.360 --> 00:41:45.520
+So I forced myself to learn Docker, and what occurred to me was, oh,
+
+00:41:45.980 --> 00:41:50.320
+Docker is just writing down in a file what I would normally have to type in the
+
+00:41:50.480 --> 00:41:52.100
+terminal to make the thing happen.
+
+00:41:52.580 --> 00:41:58.420
+Except I put RUN in front of the command, or COPY instead of cp, and I
+
+00:41:58.540 --> 00:42:02.680
+get repeatability. And, and, and someone else has packaged a bunch of this stuff,
+
+00:42:02.800 --> 00:42:04.120
+so you don't have to do it yourself.
+
+00:42:04.790 --> 00:42:04.900
+Exactly.
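That "write down what you'd type" idea looks roughly like this: a minimal, hypothetical Dockerfile where each instruction mirrors a terminal command (RUN in front of a command, COPY instead of cp). The base image, paths, and app layout are made up for illustration:

```shell
# A Dockerfile is essentially the terminal session, written down:
cat > Dockerfile <<'EOF'
FROM python:3.12-slim

# what you'd type by hand: apt-get update && apt-get install ...
RUN apt-get update && apt-get install -y --no-install-recommends curl

# what you'd type by hand: cp -r ./app /srv/app
COPY ./app /srv/app
WORKDIR /srv/app

# what you'd type by hand: pip install -r requirements.txt
RUN pip install -r requirements.txt

CMD ["python", "main.py"]
EOF

# Every RUN/COPY line is a command you'd otherwise run over SSH,
# except now it's repeatable on any machine.
cat Dockerfile
```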
+
+00:42:05.210 --> 00:42:05.900
+And I'm like, okay,
+
+00:42:06.360 --> 00:42:08.280
+I don't know what all my concern was about
+
+00:42:08.460 --> 00:42:10.200
+because it's not much more complicated.
+
+00:42:10.880 --> 00:42:13.120
+One of my concerns was sort of monitorability.
+
+00:42:13.300 --> 00:42:15.240
+Like if I just go there
+
+00:42:15.270 --> 00:42:16.720
+and I just create a bunch of virtual environments
+
+00:42:16.830 --> 00:42:17.680
+and run my code,
+
+00:42:17.760 --> 00:42:19.520
+I can actually go and see the source code.
+
+00:42:19.630 --> 00:42:20.780
+I can see the config files.
+
+00:42:20.930 --> 00:42:22.640
+I can see where the logs are being written
+
+00:42:22.860 --> 00:42:25.480
+and I can sort of manage it through SSH.
+
+00:42:25.840 --> 00:42:26.180
+And I thought, well,
+
+00:42:26.210 --> 00:42:28.640
+if I put a bunch of disconnected Docker things together,
+
+00:42:28.800 --> 00:42:29.640
+that's going to be challenging.
+
+00:42:29.730 --> 00:42:31.420
+And I realized actually, not really.
+
+00:42:32.100 --> 00:42:38.360
+Like, if you set it up the same, you could still tail all the logs, and you can, you can SSH into
+
+00:42:38.840 --> 00:42:43.920
+the containers if you really have to, you know, look at something running inside them. Like, what is the
+
+00:42:44.240 --> 00:42:48.260
+process actually? I don't know. What does it do? How much memory is it using relative to other stuff?
+
+00:42:48.500 --> 00:42:53.400
+But, and I also talk about a bunch of tools for monitoring them. Yeah. So how has that changed over
+
+00:42:53.560 --> 00:42:59.219
+time? Like, you started with some fairly bare bones, you've got, you've got some extra tools. What, what's,
+
+00:42:59.240 --> 00:43:01.540
+What does that evolution look like?
+
+00:43:01.840 --> 00:43:06.580
+Well, I used to rely more on whatever the cloud provider,
+
+00:43:07.140 --> 00:43:08.640
+DigitalOcean or Hetzner, offered.
+
+00:43:09.000 --> 00:43:11.220
+You know, they always have like a metrics section.
+
+00:43:11.320 --> 00:43:13.040
+So I can go see, well, what's the CPU doing?
+
+00:43:13.500 --> 00:43:16.240
+What's the memory looking like over time?
+
+00:43:16.820 --> 00:43:18.480
+And that works okay.
+
+00:43:19.140 --> 00:43:21.320
+And if you've got one app that you're running there,
+
+00:43:21.380 --> 00:43:23.740
+you're like, okay, well, that must be how much memory the app is using.
+
+00:43:24.120 --> 00:43:27.540
+But right now, if I go to Talk Python and I ask,
+
+00:43:27.840 --> 00:43:30.420
+I think there are 27 different containers running,
+
+00:43:31.240 --> 00:43:33.980
+with which you can't just ask, how's the server doing?
+
+00:43:34.030 --> 00:43:34.880
+It doesn't tell you very much.
+
+00:43:35.060 --> 00:43:36.560
+You know, it really matters much more.
+
+00:43:36.640 --> 00:43:37.760
+Well, it's busy.
+
+00:43:37.770 --> 00:43:38.160
+I get it.
+
+00:43:38.170 --> 00:43:39.580
+But which one is the problem?
+
+00:43:39.800 --> 00:43:40.600
+Which one is busy?
+
+00:43:40.720 --> 00:43:41.480
+Which one's using all of them?
+
+00:43:41.860 --> 00:43:43.640
+So I started to look around
+
+00:43:43.970 --> 00:43:45.320
+and there's actually a bunch of recommendations
+
+00:43:45.800 --> 00:43:47.400
+that I have for the book.
+
+00:43:47.940 --> 00:43:49.000
+So one of them,
+
+00:43:49.310 --> 00:43:51.220
+the first one I used was this thing called Glances
+
+00:43:51.430 --> 00:43:52.740
+and Glances is excellent.
+
+00:43:52.930 --> 00:43:53.520
+And by the way,
+
+00:43:53.960 --> 00:43:56.580
+Glances, the way they talk about often getting it,
+
+00:43:56.700 --> 00:43:59.260
+I think, where do they talk about installing it here?
+
+00:43:59.370 --> 00:44:02.220
+Probably it's often like apt install glances
+
+00:44:02.680 --> 00:44:03.800
+or something like that, right?
+
+00:44:04.120 --> 00:44:06.620
+But a lot of these tools even have Docker versions.
+ +00:44:07.180 --> 00:44:09.640 +If you share the volumes and sockets just right, + +00:44:09.920 --> 00:44:11.140 +they function just the same. + +00:44:11.350 --> 00:44:14.880 +So you could say Docker run glances XYZ + +00:44:15.340 --> 00:44:16.380 +and it doesn't even install, + +00:44:16.520 --> 00:44:18.560 +it doesn't even touch your one big server + +00:44:18.780 --> 00:44:20.720 +that is kind of like your building space. + +00:44:20.830 --> 00:44:22.380 +So it leaves it a little more pure, right? + +00:44:22.700 --> 00:44:23.820 +So glances is super cool. + +00:44:23.920 --> 00:44:33.600 +And what it does is it gives you this really nice dashboard of what's going on with your app, like your server. + +00:44:33.670 --> 00:44:34.720 +How much memory is being used? + +00:44:34.950 --> 00:44:36.220 +How much CPU is being used? + +00:44:36.320 --> 00:44:37.400 +How has that been over time? + +00:44:37.570 --> 00:44:40.480 +Has there been like extended spikes and so on? + +00:44:40.680 --> 00:44:46.300 +And one of the things that's new to Glances, and I don't think it's in this picture that's on their home screen. + +00:44:46.600 --> 00:44:47.880 +I'm pretty sure it is not. + +00:44:48.070 --> 00:44:48.500 +Oh, no, it is. + +00:44:48.840 --> 00:44:50.720 +It just, mine is inverted because I have so many. + +00:44:51.060 --> 00:44:52.480 +It has a container section. + +00:44:52.580 --> 00:44:59.820 +So when you run it, it actually shows you not just the processes, but also gives you a performance, memory, IO, etc. + +00:45:00.320 --> 00:45:03.580 +for all the running containers and their uptimes and those kind of things. + +00:45:04.020 --> 00:45:05.360 +So this is super cool. + +00:45:05.900 --> 00:45:10.860 +So you construct a certain Docker command and then you have this running dashboard that just goes. + +00:45:11.320 --> 00:45:13.700 +So this is the first thing that I started with and I really like that. 
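The "Docker run glances XYZ" command being gestured at here has roughly the following shape, as I recall it from the Glances docs: mount the Docker socket read-only so Glances can see the other containers, and share the host PID namespace so it can see host processes. The exact image tag and flags may differ from what's actually in use on the Talk Python server:

```shell
# Approximate shape of running Glances itself as a container.
# Built as a string and printed rather than executed, since
# actually running it requires a Docker daemon.
GLANCES_CMD="docker run --rm -it \
  -v /var/run/docker.sock:/var/run/docker.sock:ro \
  --pid host \
  nicolargo/glances"

# Save and show the command; the socket mount is what lets a
# containerized Glances "look back into the host".
echo "$GLANCES_CMD" > glances_cmd.txt
cat glances_cmd.txt
```

Because the socket is mounted `:ro` and Glances never touches the host filesystem, the "one big server" stays untouched, which is the purity benefit described above.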
+ +00:45:14.140 --> 00:45:16.880 +But then I also found something called BTOP. + +00:45:16.920 --> 00:45:17.980 +Are you familiar with BTOP? + +00:45:18.240 --> 00:45:19.220 +No, I haven't used this one. + +00:45:19.320 --> 00:45:20.660 +Oh my gosh, BTOP is incredible. + +00:45:21.020 --> 00:45:21.980 +This is so good. + +00:45:22.020 --> 00:45:24.120 +it's really something. + +00:45:25.280 --> 00:45:25.860 +Zoom in on it. + +00:45:26.000 --> 00:45:29.820 +So this gives you moving graphs over time + +00:45:30.160 --> 00:45:31.360 +of all sorts of things. + +00:45:31.940 --> 00:45:34.400 +It shows you graphs of network, + +00:45:34.780 --> 00:45:36.000 +inbound, outbound traffic. + +00:45:36.150 --> 00:45:37.740 +It shows you the CPUs. + +00:45:37.740 --> 00:45:38.900 +It gives you a breakdown of like, + +00:45:38.960 --> 00:45:40.800 +here's all the different distributed work + +00:45:41.020 --> 00:45:43.540 +across your eight CPU cores and over total. + +00:45:44.400 --> 00:45:45.700 +It's really something else. + +00:45:45.830 --> 00:45:47.500 +And so this one is really nice. + +00:45:47.500 --> 00:45:50.359 +You can configure the UI a lot to show + +00:45:50.380 --> 00:45:53.320 +and zoom in on disk activity or whatever. + +00:45:53.800 --> 00:45:55.020 +This is really a nice way to view it. + +00:45:55.020 --> 00:45:57.200 +And again, when you run all these Docker containers, + +00:45:57.340 --> 00:45:59.980 +they feel like they're super isolated and tucked away. + +00:46:00.420 --> 00:46:01.820 +And from their perspective, they are. + +00:46:02.060 --> 00:46:04.040 +But when you look in the process list here, + +00:46:04.380 --> 00:46:06.480 +it just shows the process that Docker is running. + +00:46:06.860 --> 00:46:09.360 +So I have all my web apps and APIs and stuff + +00:46:09.640 --> 00:46:10.860 +setting a process name. 
+ +00:46:11.260 --> 00:46:13.440 +So instead of just Python, Python, Python, Python, Python, + +00:46:13.860 --> 00:46:16.520 +it'll say like, talk Python, Granian Worker 1, + +00:46:16.740 --> 00:46:17.940 +talk Python, Granian Worker 2. + +00:46:19.400 --> 00:46:21.820 +Versus indexing service daemon. + +00:46:22.170 --> 00:46:26.140 +And then when you look into any of these tools, you can see, oh, exactly what is busy. + +00:46:26.190 --> 00:46:30.840 +And those are actually the names inside of Docker, but they still surface exactly like + +00:46:31.000 --> 00:46:31.920 +that to all these tools. + +00:46:32.880 --> 00:46:37.680 +One of the things you said kind of hit home for me, like it was subtle and it kind of + +00:46:37.750 --> 00:46:40.940 +moved on, which was like, if you interconnect it correctly, right? + +00:46:41.040 --> 00:46:44.840 +Like if you get the files and sockets going, this goes smoothly. + +00:46:45.820 --> 00:46:49.360 +And I think it's one of the things you've done very, very well in the book is sort of + +00:46:49.380 --> 00:46:52.600 +through that, like, as you talk about the different Docker configurations, like, okay, + +00:46:53.030 --> 00:46:57.120 +well, this is why we're putting this here rather than in the container, this is going to be shared. + +00:46:57.740 --> 00:47:02.680 +And, you know, and, and there's, and the reason for this, I assume some of that was experimental. + +00:47:03.110 --> 00:47:07.440 +You just sort of over time, you kind of went, oh, okay, wait, I need that somewhere else. + +00:47:08.000 --> 00:47:08.160 +Yeah. + +00:47:08.360 --> 00:47:14.060 +Or, or was it, you know, did, was there somebody's knowledge that you, depended + +00:47:14.110 --> 00:47:14.700 +on a lot there? + +00:47:14.780 --> 00:47:15.720 +How did you get there? + +00:47:16.040 --> 00:47:17.580 +How, how organic was the journey? 
+
+00:47:18.260 --> 00:47:24.560
+The, I would say half and half. Like some of it, for example, the Glances stuff, I just found
+
+00:47:25.020 --> 00:47:29.860
+when I went looking for it, that there was like ways to install. And it just said, oh,
+
+00:47:29.860 --> 00:47:33.900
+you could just install it by running this Docker thing. And it's like a big, long command. And I'm
+
+00:47:33.900 --> 00:47:39.020
+like, oh, that's cool. Because it doesn't matter how long the command is. What I would do is I'll
+
+00:47:39.020 --> 00:47:46.079
+go into my .zshrc and say alias glances equals paste. And then I saved that somewhere. And I
+
+00:47:46.040 --> 00:47:51.040
+never, I couldn't tell you what it is at all. I just know it has to, like, do a, yeah, it has a few things
+
+00:47:51.110 --> 00:47:57.220
+so it can, like, look back into the host to see, you know, what's running and so on. Yeah, so a lot of
+
+00:47:57.250 --> 00:48:02.920
+it was like that. And then some of it was definitely, you know, two whole days of, like, why
+
+00:48:03.180 --> 00:48:07.260
+won't that talk to that? Let me build it again, let me do some more debugging, you know what I mean? And
+
+00:48:07.420 --> 00:48:11.560
+eventually, like, okay, all right. But once you get it kind of dialed, then it's, once you get a little bit
+
+00:48:11.580 --> 00:48:16.320
+of it working, it's a blueprint. You just, like, again, again, again. So you seem to have taken a bit of
+
+00:48:16.400 --> 00:48:22.980
+a, like, a heavier weight approach here. You're just, it's everything and the kitchen sink. That
+
+00:48:23.120 --> 00:48:27.720
+implies that it's not the right amount, but it's counter to some of the advice that's out there.
+
+00:48:28.070 --> 00:48:34.700
+Sometimes folks talk about, you know, wanting to have things as minimal as possible. Why, what
+
+00:48:34.860 --> 00:48:40.440
+you've done versus the other? How are they wrong? Can we start a flame war on the internet, you
+
+00:48:40.460
--> 00:48:44.940
+Let's do it. Let's see how many "Michael, you're wrong"s I can get in the YouTube comments.
+
+00:48:45.150 --> 00:48:53.000
+Actually, please, that's not a challenge. So here's the deal. I want, especially at the
+
+00:48:53.180 --> 00:48:59.380
+beginning of this journey, when I was like, I want as much comfort and visibility as I can get
+
+00:48:59.670 --> 00:49:04.540
+in these containers and other areas. You know what I mean? And I wanted to make it as close to,
+
+00:49:04.600 --> 00:49:10.540
+if I just set up a VM and just, you know, uv venv, and just roll from there, right? So what I did
+
+00:49:10.640 --> 00:49:17.720
+is I said, okay, I could try to go for like the super slim Docker image, or I could just get like
+
+00:49:17.780 --> 00:49:23.300
+a basic Docker image, but then put, you know, I put like Oh My Zsh and zsh on it, right?
+
+00:49:23.680 --> 00:49:27.940
+Does it need it? No, you could use sh, but do you know what happens when you use sh and you go
+
+00:49:28.100 --> 00:49:32.700
+in there? It's a brand new start. It doesn't remember anything you've ever, any command you've
+
+00:49:32.720 --> 00:49:37.840
+ever run, it doesn't give you any help. You know, you hit tab, it doesn't tell you nothing, right?
+
+00:49:37.980 --> 00:49:42.580
+You're like, oh gosh. But if you use like Oh My Zsh, it'll show you, hey, what version of Python
+
+00:49:42.750 --> 00:49:48.000
+is your virtual environment activated in? And I can just hit Ctrl-R and see, you know, filter all
+
+00:49:48.140 --> 00:49:53.400
+my history and I can hit git tab and it'll autocomplete all the git commands that I forgot what
+
+00:49:53.400 --> 00:49:56.800
+I was supposed to use because I'm freaked out because the site is down and how do I fix this?
+ +00:49:56.940 --> 00:50:01.200 +I mean, I wouldn't actually be in the container for that, but a lot of times you're in there kind + +00:50:01.220 --> 00:50:05.260 +of exploring because you're like, it's been fine for six months, but I need to see something. + +00:50:05.590 --> 00:50:10.580 +And so in the book, at least in the beginning, I recommended to people that they install + +00:50:11.310 --> 00:50:15.340 +some of these tools that you might install into your own terminal to make you feel more comfortable. + +00:50:15.730 --> 00:50:20.180 +So that my assumption is you're kind of new to Docker. You're feeling a little uncomfortable. + +00:50:20.560 --> 00:50:27.000 +Like who cares if it's another hundred megs on your hard drive? You're not shipping your app + +00:50:27.040 --> 00:50:33.160 +to Docker Hub. You're not going to take your web app, probably. It's not a reusable component. + +00:50:33.740 --> 00:50:38.280 +You've got your source code and you want the thing to just run here. You're not shipping it. So + +00:50:38.500 --> 00:50:42.680 +whoever wants to run, you know, indeed.com can just Docker pull that and run it. Like it's, + +00:50:43.040 --> 00:50:48.760 +that's not what it is. And in that context, you're not so worried about the space. And there's a + +00:50:48.840 --> 00:50:53.579 +couple of tips that you can use for like really, really fast build times, right? So I mean, like + +00:50:53.560 --> 00:50:59.120 +container build times for me are like seconds, a few seconds, even though, you know, there's 250 + +00:50:59.720 --> 00:51:05.840 +pip install packages for Talk Python training, you know, build times, and build time is also + +00:51:06.100 --> 00:51:10.440 +installing Python, right? You could make these things fast, so it's not like a huge impedance, + +00:51:11.160 --> 00:51:16.900 +but I think for people who are new to it, having something other than just sh, not even bash, + +00:51:17.340 --> 00:51:21.820 +you're a lot better off. 
So that's what I promoted. And I think it's, you know, it kind of comes back
+
+00:51:21.840 --> 00:51:23.760
+to sort of the thesis of the book as well, right?
+
+00:51:23.920 --> 00:51:25.260
+Which, which is right for you.
+
+00:51:25.770 --> 00:51:30.360
+So, you know, if you are going to be running a thousand of these spread
+
+00:51:30.410 --> 00:51:34.260
+across a whole bunch of different cores, then yeah, if you optimize this, that
+
+00:51:34.290 --> 00:51:36.340
+might change your cost framework and everything else.
+
+00:51:37.000 --> 00:51:37.560
+Well, right.
+
+00:51:37.720 --> 00:51:44.460
+Or if, if your 27 apps on eight CPUs work fine, then, you
+
+00:51:44.540 --> 00:51:45.640
+know, go for it.
+
+00:51:45.760 --> 00:51:48.060
+You know, why, why get in your own way?
+
+00:51:48.400 --> 00:51:51.800
+And that advice is not, like, this is why I really emphasize the
+
+00:51:51.820 --> 00:51:53.120
+context sort of thing, right?
+
+00:51:53.460 --> 00:51:58.780
+This advice is bad if your goal is to ship a container to Docker Hub so that people can
+
+00:51:58.880 --> 00:52:00.400
+self-host your open source thing.
+
+00:52:00.940 --> 00:52:03.600
+You don't want that to have extra crap that they don't need.
+
+00:52:03.820 --> 00:52:08.240
+But when there's only one of them for your machine and you're building it and you're managing
+
+00:52:08.480 --> 00:52:11.540
+it, you know, make it as comfy and helpful as possible.
+
+00:52:12.420 --> 00:52:13.060
+That was my philosophy.
+
+00:52:13.460 --> 00:52:20.240
+The, the structure of your site is, it has a lot of different pieces to it using
+
+00:52:20.260 --> 00:52:25.580
+different technology. You spend some time talking about like static sites and using static sites for
+
+00:52:25.740 --> 00:52:33.000
+part of it versus, you know, Python applications and those kinds of parts. How did you end up here?
+
+00:52:33.140 --> 00:52:37.120
+Like oftentimes the answer when you're looking at this kind of stuff is, well, I need a CMS for
+
+00:52:37.300 --> 00:52:42.720
+everything. And then I will try to figure out how to square peg my round hole of a CMS or whatever.
+
+00:52:43.060 --> 00:52:48.340
+So how did you end up with a collection? Well, you know, like many things that start simple,
+
+00:52:48.740 --> 00:52:50.060
+and you're like, well, just one more thing.
+
+00:52:50.580 --> 00:52:55.360
+So I tried, I'd kept pretty much the same web framework across all my different sites thinking,
+
+00:52:55.750 --> 00:52:57.980
+okay, that's, I'm going to just pick one and go with it.
+
+00:52:57.980 --> 00:52:58.960
+I think a lot of people do that.
+
+00:52:59.100 --> 00:53:00.460
+You know, there's people who are like, I use Django.
+
+00:53:00.700 --> 00:53:02.180
+There's people, I use Flask and so on.
+
+00:53:02.540 --> 00:53:08.280
+And then just slowly over time, you're like, really, this is, this part is really a hassle.
+
+00:53:08.700 --> 00:53:14.500
+I'd be a lot better off if I made that part served through the CDN or why am I, you know,
+
+00:53:14.540 --> 00:53:18.920
+One of the things that I see a lot, and it doesn't drive me crazy, but I'm just like,
+
+00:53:19.060 --> 00:53:23.840
+yeah, it's probably not necessary, is a lot of people in technology X.
+
+00:53:24.010 --> 00:53:24.880
+For us, that's Python.
+
+00:53:25.130 --> 00:53:26.460
+It could be JavaScript.
+
+00:53:26.800 --> 00:53:28.160
+It could be .NET, whatever, right?
+
+00:53:28.540 --> 00:53:32.400
+People who work extensively in that and have a lot of their identity tied into that, like
+
+00:53:32.410 --> 00:53:33.120
+I do and others.
+
+00:53:34.140 --> 00:53:36.080
+Like, I'm a Python developer, right?
+
+00:53:36.300 --> 00:53:41.720
+So if I'm going to choose a tool, like, let's say, a CMS or a static site generator or something
+
+00:53:41.860 --> 00:53:43.400
+like that, I'm going to choose the Python one.
+
+00:53:43.740 --> 00:53:48.540
+I'm a Python person, like, okay, but are there better options out there than the Python ones for what
+
+00:53:48.620 --> 00:53:52.620
+you're trying to do? Because are you going to extend this tool? No? Then what do you care
+
+00:53:52.620 --> 00:53:57.300
+what it's written in, right? Your operating system is not written in Python. It's written in, yeah,
+
+00:53:57.580 --> 00:54:01.920
+yeah, C. Or I'm not going to use this word processor because it's not written in Python.
+
+00:54:02.660 --> 00:54:07.040
+Exactly. So I have to go back, no, I need a new service. Like, it doesn't, you don't see it, you don't
+
+00:54:07.040 --> 00:54:11.400
+have to work with it, you don't care. And so I ended up a little bit with the mishmash of just trying
+
+00:54:11.420 --> 00:54:15.720
+to say, like, what are the best tools? Like, for example, for the blog and some of the other static
+
+00:54:16.020 --> 00:54:21.800
+elements, I've used Hugo, which is written in Go. It's like, okay, I type the command, the command hugo,
+
+00:54:22.030 --> 00:54:26.320
+you know, and it does its thing. I don't really care what it's written in. The templating extension
+
+00:54:26.370 --> 00:54:30.980
+is a little bit annoying, but I kind of just went around and said, okay, well, what do I think
+
+00:54:31.080 --> 00:54:37.680
+would be the best fit to make my life easy, not to reinforce my identity as this type
+
+00:54:37.600 --> 00:54:45.320
+of developer or that type of developer, you know? Yeah, one of the things, you know, I'll show
+
+00:54:45.330 --> 00:54:51.000
+my own stripes here, and you can defend your beloved Flask if you like, but having come from
+
+00:54:51.180 --> 00:54:57.080
+the Django side, some of the things that you've kind
of learned organically here are forced on you
+
+00:54:57.090 --> 00:55:03.000
+in Django. So when you, like, the instructions for putting together a
+
+00:55:03.020 --> 00:55:07.700
+production site are, you will run this command and it will move all of your static content over
+
+00:55:07.900 --> 00:55:12.320
+here. So like your mental model, when you come from that side is, oh, my site is actually built
+
+00:55:12.500 --> 00:55:17.700
+of these, at least these two different things. And I think coming from Flask, you might've, sort of,
+
+00:55:18.160 --> 00:55:24.140
+that discovery might've been a little more organic. You might not have been forced into it
+
+00:55:24.260 --> 00:55:29.020
+immediately, but once you've come to that realization of, oh, wait, I have these pieces
+
+00:55:29.460 --> 00:55:32.600
+and I can use something like Nginx to tie it all together,
+
+00:55:33.020 --> 00:55:35.200
+that means, well, then I don't have to figure out
+
+00:55:35.580 --> 00:55:37.240
+how to use a CMS for this thing
+
+00:55:37.360 --> 00:55:38.960
+that's very unnatural for a CMS.
+
+00:55:39.160 --> 00:55:42.260
+I can just mount it under slash blog
+
+00:55:42.540 --> 00:55:43.720
+and it'll work fine, yeah.
+
+00:55:44.100 --> 00:55:46.120
+Yeah, Django is very powerful.
+
+00:55:46.180 --> 00:55:46.740
+It definitely is.
+
+00:55:46.740 --> 00:55:49.000
+And I actually talked a lot about that in the book,
+
+00:55:50.920 --> 00:55:52.300
+when evaluating web frameworks.
+
+00:55:52.840 --> 00:55:54.880
+But I would say before we, if we're going to that,
+
+00:55:54.880 --> 00:55:57.640
+but before we do, I think your point about using Nginx
+
+00:55:57.680 --> 00:55:58.600
+to piece them together,
+
+00:55:58.840 --> 00:56:03.420
+things together, or Caddy or Traefik or whatever, it doesn't matter, like some front-end web server,
+
+00:56:03.560 --> 00:56:09.820
+they all do it. Yep. Yeah. Is, so often people think, I have this Python app, let's say I have a Django
+
+00:56:10.040 --> 00:56:15.940
+app, so I want to add a CMS to it. What could I possibly add? Is it static content? Well, maybe what
+
+00:56:15.960 --> 00:56:19.640
+you should add is Hugo. I don't know, I'm just making this up, right? Like, it might actually be a bad
+
+00:56:19.800 --> 00:56:24.480
+option. But, well, Hugo is not a Python thing, so how do I put it into my Django app? I mean, they're very,
+
+00:56:24.720 --> 00:56:28.800
+very different in the way they work, so they don't really go that super well together if you were to,
+
+00:56:28.820 --> 00:56:35.160
+like, how does one literally, source code wise, go into another? But if you just made like a Hugo site,
+
+00:56:35.240 --> 00:56:40.060
+or other static site, however you make it, and then put it on the same server, and then in Nginx
+
+00:56:40.060 --> 00:56:46.320
+you say, if you go to this domain slash docs, it goes completely over here, and if it goes anywhere
+
+00:56:46.410 --> 00:56:51.780
+else, it goes just to Django, then all of a sudden from the outside it looks like a very cohesive,
+
+00:56:52.290 --> 00:56:56.720
+single thing with just different sub-URLs. But you get to choose the best technology for the static
+
+00:56:56.740 --> 00:57:02.420
+bits and you get to choose the best technology for your dynamic, data-driven bits, and that is all just
+
+00:57:02.600 --> 00:57:08.000
+done by configuring the front-end web server that you don't even have visibility to in Python. And I
+
+00:57:08.100 --> 00:57:12.900
+think that
that's a big mental shift but it's like those kinds of things that bring a both the + +00:57:13.060 --> 00:57:16.600 +flexibility to make these choices and the simplicity to not try to jam them together + +00:57:17.540 --> 00:57:22.560 +there's a uh there's a a third-party library for django that i use once in a while which is called + +00:57:22.620 --> 00:57:29.260 +distill and it's a static site generator based on django so say you had your url was like books + +00:57:29.720 --> 00:57:35.340 +you know it's books slash one book slash two book slash three well you tell distill i want book and + +00:57:35.340 --> 00:57:42.040 +i want all the possible queries of this number and it will generate the results as static so even + +00:57:42.260 --> 00:57:47.720 +when you've got a dynamic site you can actually carve off the static portion and then have that + +00:57:47.800 --> 00:57:53.080 +fed straight out of nginx and if it isn't if there's no actual dynamic content on the page and + +00:57:53.140 --> 00:57:57.740 +if it only updates when the database updates or something like that and you can do it nightly like + +00:57:57.740 --> 00:58:02.540 +this gives you all sorts of other options and you know to come back to your my eight processor + +00:58:02.760 --> 00:58:08.720 +whatever's well the static sites are almost free you do not even need that you don't even need it + +00:58:09.040 --> 00:58:15.480 +it's nothing so like you can scale way down and have an absolutely mammoth site just by properly + +00:58:15.500 --> 00:58:20.620 +fine-tuning what's dynamic and what's static yeah you could go to millions and millions of requests + +00:58:20.780 --> 00:58:27.760 +if you just converted all that stuff to static and then put the x the extra resources css image + +00:58:28.320 --> 00:58:34.700 +javascript etc on a cdn yeah like i mean that is like almost web that's like web on easy mode right + +00:58:34.700 --> 00:58:41.600 +there because you it can't go down unless the 
server literally, almost, almost can't. Yeah, I mean,
+
+00:58:41.420 --> 00:58:45.360
+Let's not say we're on the eve of, like, are we on the eve of a fifth going down?
+
+00:58:46.180 --> 00:58:50.900
+But I mean, like it can't go down because the code is wrong or there's a race condition
+
+00:58:51.120 --> 00:58:51.820
+or you're out of memory.
+
+00:58:52.270 --> 00:58:57.920
+Like it's really close to just if, if the, if the web server is up and you put a CDN
+
+00:58:57.920 --> 00:59:01.940
+in front of it and then it's not even necessarily that it's like the CDN has to go.
+
+00:59:02.020 --> 00:59:03.780
+You've got to have a Cloudflare-level incident.
+
+00:59:04.000 --> 00:59:04.080
+Yeah.
+
+00:59:04.580 --> 00:59:07.400
+And fully distributed in many cases.
+
+00:59:07.600 --> 00:59:07.660
+Right.
+
+00:59:07.760 --> 00:59:17.020
+So people in, you know, in continents, other than where you are based are getting fantastic load times because it's cached locally for them.
+
+00:59:17.580 --> 00:59:19.000
+Yeah, I just want to give a little shout out.
+
+00:59:19.280 --> 00:59:21.500
+I'm going to give a little shout out to Bunny.net.
+
+00:59:21.700 --> 00:59:29.320
+Like, I know people are all about Cloudflare, but this is a cool European company that focuses on privacy, has some really nice features.
+
+00:59:29.550 --> 00:59:30.260
+The pricing is great.
+
+00:59:30.430 --> 00:59:33.220
+And they have, you go here, go to the CDN.
+
+00:59:33.540 --> 00:59:40.840
+They've got somewhere way down here, you know, like 119 different places, including all over Africa.
+
+00:59:41.410 --> 00:59:44.620
+And this is just super, super cheap for traffic.
+
+00:59:45.280 --> 00:59:45.380
+Nice.
+
+00:59:45.920 --> 00:59:46.000
+Yeah.
+
+00:59:46.130 --> 00:59:52.400
+So I wasn't, earlier there, I wasn't intending to force a fistfight.
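The front-end routing discussed above, static content served straight off disk and everything else proxied through to the Python app server, can be sketched as an Nginx config roughly like this. This is an illustrative fragment only: the domain, paths, and port are invented, not anyone's actual configuration.

```nginx
server {
    listen 80;
    server_name example.com;

    # Static site (e.g. Hugo's build output) served straight off disk
    location /docs/ {
        root /srv/static;   # a request for /docs/x maps to /srv/static/docs/x
    }

    # Everything else goes to the Python app server (Granian, Gunicorn, etc.)
    location / {
        proxy_pass http://127.0.0.1:8000;
        proxy_set_header Host $host;
    }
}
```

From the outside it looks like one cohesive site with different sub-URLs; internally the static and dynamic halves never have to know about each other.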
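The Distill idea mentioned above, enumerating every value a dynamic URL can take and writing each rendered page out as a static file, can be sketched framework-free like this. It's a toy stand-in, not django-distill's real API: the book ids and the `render_book` helper are invented, and in a real Django project the ids would come from the database and the rendering would go through views and templates.

```python
# Hypothetical sketch of "carving off the static portion" of a dynamic site:
# pre-render /books/<id>/ pages into index.html files that Nginx can serve
# directly, no Python process involved at request time.

import tempfile
from pathlib import Path

BOOK_IDS = [1, 2, 3]  # made up; in Django this would be a database query

def render_book(book_id: int) -> str:
    # stand-in for rendering the real view/template for one book
    return f"<h1>Book {book_id}</h1>"

def distill_books(output_dir: str) -> list[str]:
    """Write one static index.html per book and return the paths written."""
    written = []
    for book_id in BOOK_IDS:
        page_dir = Path(output_dir) / "books" / str(book_id)
        page_dir.mkdir(parents=True, exist_ok=True)
        page = page_dir / "index.html"
        page.write_text(render_book(book_id))
        written.append(str(page))
    return written

pages = distill_books(tempfile.mkdtemp())
print(len(pages))  # prints: 3
```

Run on a schedule (say, nightly, or whenever the database changes) and point the front-end web server at the output directory, and the "dynamic" pages become static for free.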
+
+00:59:52.720 --> 00:59:59.060
+And, you know, we're on the opposite side of the continent, so that would be a challenge when I, Django versus Flask, go.
+
+01:00:00.040 --> 01:00:06.960
+But, you know, I think one of my favorite chapters was actually chapter 13, which was titled Picking a Python Web Framework.
+
+01:00:07.200 --> 01:00:09.860
+I really liked the nuance of this.
+
+01:00:11.980 --> 01:00:15.160
+It's unusual for folks to sort of reveal their reasoning.
+
+01:00:16.080 --> 01:00:26.100
+And honestly, I think, like, because I had no intention of tomorrow going and using Docker, the Docker chapters were interesting because I like to see how other people do things.
+
+01:00:26.640 --> 01:00:33.300
+But like I could grab the Picking a Python Web Framework chapter, pull that chapter out and hand it to almost any of my students.
+
+01:00:33.530 --> 01:00:37.400
+Right. Like it's this, you know, how do I make these kinds of decisions?
+
+01:00:37.590 --> 01:00:39.680
+How is this different? Why do I think about these things?
+
+01:00:40.280 --> 01:00:43.740
+And so often the content on this is really just religious war.
+
+01:00:44.620 --> 01:00:51.740
+And you've done like a really, really good job there of just sort of conveying this, you know, hey, here are pros and cons for each.
+
+01:00:51.820 --> 01:00:53.480
+And this is why I picked this.
+
+01:00:53.640 --> 01:00:58.000
+And so I really, really liked how you, how you covered that.
+
+01:00:58.520 --> 01:00:58.960
+Thank you so much.
+
+01:00:59.460 --> 01:01:05.560
+What you, I guess it's maybe the answer is obvious, but, but why?
+
+01:01:05.980 --> 01:01:07.300
+Like you were fine.
+
+01:01:07.350 --> 01:01:11.600
+You were just doing configuration file after configuration file and then a little bit editorial.
+
+01:01:12.580 --> 01:01:18.380
+What, what caused you, what was the impetus for spicing it up a little?
+
+01:01:18.440 --> 01:01:44.600
+Well, I mean, I think an important part of the journey is, is picking a technology to run your code on. So there's actually a couple of places that I kind of have that. Like, I have that for, I'm trying to create a term, because I don't think we do a good job of disambiguating it from, like, Nginx, of Python, like, production app servers, like where your code runs. I think these need a little more disambiguation. I'm talking like Granian, Gunicorn, those kinds of things.
+
+01:01:46.360 --> 01:01:50.220
+Hypercorn, Uvicorn, all those places you run your Python code.
+
+01:01:50.420 --> 01:01:55.000
+So I kind of went through a debate on those from Michael's context, right?
+
+01:01:55.260 --> 01:01:57.360
+And then I did the same for Python web frameworks.
+
+01:01:57.720 --> 01:02:02.080
+And it was, you know, I told the story of the bootstrap and how I just, every time I have
+
+01:02:02.080 --> 01:02:04.000
+to write new code, I'm like, here we go.
+
+01:02:04.740 --> 01:02:07.560
+I'm in the relic, writing the relic code.
+
+01:02:08.700 --> 01:02:13.920
+I kind of felt the same way for, so everything was based on Pyramid and I loved Pyramid and
+
+01:02:13.920 --> 01:02:15.260
+I still have a lot of respect for it.
+
+01:02:15.320 --> 01:02:26.780
+The reason I chose Pyramid in 2015 was when I went to the Flask page, it said, you may potentially be able to use Python 3, but we are not supporting it and we don't recommend it.
+
+01:02:27.410 --> 01:02:31.100
+And I'm like, wait a minute, didn't Python 3 come out in 2008?
+
+01:02:31.940 --> 01:02:33.280
+That's like seven years later.
+
+01:02:33.670 --> 01:02:34.200
+You know what?
+
+01:02:34.740 --> 01:02:36.280
+No, I'm not doing that.
+
+01:02:37.520 --> 01:02:43.800
+I'm starting this project beyond this problem and I'm not going back to be in the, you know what I mean?
+
+01:02:44.220 --> 01:02:46.460
+As they've since obviously moved on from that.
+
+01:02:46.840 --> 01:02:47.920
+So Flask was out.
+
+01:02:48.100 --> 01:02:50.960
+I looked at Django and I thought, I'm really like a microservice guy.
+
+01:02:51.220 --> 01:02:52.320
+I really want to use Mongo.
+
+01:02:52.940 --> 01:02:54.860
+A lot of things were not quite good fits.
+
+01:02:54.880 --> 01:02:56.440
+They actually would be better fits now, right?
+
+01:02:56.700 --> 01:02:57.420
+Even then.
+
+01:02:57.620 --> 01:03:02.060
+Yeah, no, if you want to do Mongo, that's, yeah, that's almost a deal breaker.
+
+01:03:02.280 --> 01:03:02.400
+Yeah.
+
+01:03:02.540 --> 01:03:03.060
+Yeah, I know.
+
+01:03:03.220 --> 01:03:03.360
+Almost.
+
+01:03:03.660 --> 01:03:05.740
+And so I'm like, all right, well, maybe not Django.
+
+01:03:06.080 --> 01:03:07.080
+Well, you had Pyramid.
+
+01:03:07.400 --> 01:03:10.660
+They're like, we are trying to embrace the latest standards.
+
+01:03:11.220 --> 01:03:13.240
+We're Python 3 first, et cetera, et cetera.
+
+01:03:13.740 --> 01:03:15.820
+And I'm like, all right, I'm gonna give this a chance,
+
+01:03:16.170 --> 01:03:17.840
+even though it wasn't as popular, like this is great.
+
+01:03:17.870 --> 01:03:20.660
+And I used it for eight years, seven years,
+
+01:03:20.750 --> 01:03:21.740
+something like that, it was really good.
+
+01:03:22.040 --> 01:03:24.200
+But things evolved over time, right?
+
+01:03:24.320 --> 01:03:27.100
+Like Pydantic came out and Pydantic was great.
+
+01:03:27.520 --> 01:03:29.920
+What's a really nice way to talk to the database
+
+01:03:30.090 --> 01:03:31.960
+with Pydantic? Beanie, okay?
+
+01:03:32.340 --> 01:03:34.680
+So I can do Beanie and I can do Pydantic and wow,
+
+01:03:34.920 --> 01:03:37.580
+what a really nice, clever way to write databases.
+
+01:03:38.200 --> 01:03:40.740
+And oh, Beanie's all async only,
+
+01:03:41.160 --> 01:03:42.960
+Pyramid's synchronous only.
+
+01:03:43.820 --> 01:03:45.440
+When was the last commit to Pyramid?
+
+01:03:45.760 --> 01:03:47.300
+Oh, it was two and a half years ago.
+
+01:03:47.760 --> 01:03:49.680
+Chances that it gets async support are low
+
+01:03:49.860 --> 01:03:51.500
+'cause that was just like a minor bug fix.
+
+01:03:51.630 --> 01:03:51.940
+You know what I mean?
+
+01:03:52.359 --> 01:03:53.860
+It's just like, it's fine.
+
+01:03:54.030 --> 01:03:55.920
+Open source projects, they ebb and they flow
+
+01:03:55.990 --> 01:03:56.860
+and they come and they go.
+
+01:03:57.210 --> 01:04:01.140
+But I'm just like, I should really move this forward
+
+01:04:01.320 --> 01:04:03.660
+to something that feels like it's active, right?
+
+01:04:03.770 --> 01:04:05.260
+I mean, stuff on the web makes me nervous.
+
+01:04:05.440 --> 01:04:07.900
+I'm always just, did you put a port open on the internet?
+
+01:04:08.160 --> 01:04:09.020
+Well, that's scary.
+
+01:04:09.520 --> 01:04:09.660
+- Yep.
+
+01:04:10.560 --> 01:04:16.400
+And so a framework that felt like things were not as on top of it as they could have been made
+
+01:04:16.400 --> 01:04:16.700
+me nervous.
+
+01:04:17.200 --> 01:04:21.620
+To be fair, I don't know that they had any security incidents or very, very few because
+
+01:04:21.740 --> 01:04:22.920
+it did so little, right?
+
+01:04:23.000 --> 01:04:27.180
+It's not like it had a bunch of admin pages or something where there's like accepting
+
+01:04:27.360 --> 01:04:29.840
+input, but still, still same reason.
+
+01:04:29.840 --> 01:04:36.540
+So I'm like, I really want to use these more modern tools, typing, async, Pydantic, et cetera.
+
+01:04:37.040 --> 01:04:42.200
+And I kind of would not like to continue building on something that feels like it's no longer being supported.
+
+01:04:42.460 --> 01:04:48.740
+And similarly, you, with chapter 13, sort of that, you know, the different thought process there.
+ +01:04:49.160 --> 01:04:54.540 +You also provide chapter 15, which is a retrospective on Hetzner, which is the hosting provider that you chose. + +01:04:55.220 --> 01:04:59.560 +and again, I think it's pretty clear. + +01:04:59.690 --> 01:05:01.140 +I think I've said it three different ways. + +01:05:01.560 --> 01:05:08.740 +My favorite stuff in the book really is sort of this, you know, that the little, the little, insight into Michael's brain, right? + +01:05:08.900 --> 01:05:12.260 +Like how did he make this decision and how happy is he with these decisions? + +01:05:12.640 --> 01:05:12.740 +Right. + +01:05:13.100 --> 01:05:18.220 +I think that's the stuff that's, that's, globally applicable to a reader, which is nice. + +01:05:18.690 --> 01:05:22.320 +so you've, it's now even a few months further on with Hetzner. + +01:05:22.540 --> 01:05:24.340 +So you, you still happy? + +01:05:24.690 --> 01:05:25.520 +Any regrets yet? + +01:05:26.160 --> 01:05:27.760 +Yeah, no, no regrets. + +01:05:28.340 --> 01:05:30.360 +It hasn't been absolutely a hundred percent smooth. + +01:05:30.460 --> 01:05:30.820 +Let's see. + +01:05:31.160 --> 01:05:33.840 +I could tell you how long it's been if I can get past all the ads. + +01:05:34.780 --> 01:05:35.120 +There we go. + +01:05:35.510 --> 01:05:38.540 +So I actually blogged about this. + +01:05:38.900 --> 01:05:41.640 +And yeah, so it's been about a year, I guess. + +01:05:42.820 --> 01:05:43.380 +No regrets. + +01:05:43.900 --> 01:05:47.980 +I would say if people are out there looking around, to me, it really, + +01:05:48.170 --> 01:05:50.220 +and you want to follow the philosophy of Michael, + +01:05:50.630 --> 01:05:54.000 +like carve yourself a space out in a multimillion dollar data center + +01:05:54.200 --> 01:05:55.700 +that doesn't have anything to do with it. + +01:05:55.810 --> 01:05:58.340 +And you just run your code in your own space. + +01:05:58.860 --> 01:06:01.660 +DigitalOcean and Hetzner are the two ones. 
+ +01:06:01.730 --> 01:06:03.120 +And I did DigitalOcean for a long time. + +01:06:03.190 --> 01:06:07.520 +When Hetzner came out, I thought they just had some really interesting appeal. + +01:06:07.800 --> 01:06:09.680 +I started seeing a lot of people talking about them. + +01:06:09.820 --> 01:06:12.160 +And they are a German company. + +01:06:12.390 --> 01:06:13.880 +And they were just in Europe. + +01:06:14.150 --> 01:06:17.900 +And I'm like, I love Europe, but the majority of my customers are in the U.S. + +01:06:17.930 --> 01:06:20.620 +So what is the best place for my server? + +01:06:21.200 --> 01:06:25.020 +Probably the east coast of the United States, because that serves the U.S. really well. + +01:06:25.070 --> 01:06:28.580 +But then it's like one hop across the ocean to all of Europe as well. + +01:06:28.760 --> 01:06:32.100 +So it's still really fast from there and so on. + +01:06:32.400 --> 01:06:35.440 +So I didn't want to move my server to Europe + +01:06:36.080 --> 01:06:38.500 +when I felt like being closer to the US was more important. + +01:06:38.640 --> 01:06:40.060 +Not so much because I needed to manage it. + +01:06:40.060 --> 01:06:42.920 +I could SSH to wherever, but just East Coast to the US. + +01:06:43.070 --> 01:06:46.360 +And then they're like, hey, we have two new US data centers. + +01:06:47.930 --> 01:06:50.680 +One near Virginia, right by the US East 1, + +01:06:51.160 --> 01:06:53.220 +the infamous AWS US East 1. + +01:06:53.700 --> 01:06:56.100 +And the other one actually in Hillsborough, Oregon, + +01:06:56.230 --> 01:06:58.300 +just down the street from me, which is funny. + +01:06:58.500 --> 01:07:00.980 +Yeah, it's like I could drive to it in like 20 minutes, + +01:07:02.020 --> 01:07:04.180 +which of all places in the world is relatively close. + +01:07:04.680 --> 01:07:05.960 +So I went and looked at it and I said, + +01:07:06.130 --> 01:07:07.260 +let me just check it out. 
+ +01:07:07.400 --> 01:07:09.160 +And the prices are super cheap. + +01:07:09.210 --> 01:07:11.360 +You get a little bit less support + +01:07:11.720 --> 01:07:15.800 +and I think a little bit less top tier data center + +01:07:16.040 --> 01:07:16.600 +than DigitalOcean, + +01:07:17.000 --> 01:07:19.260 +but the prices are like insane there. + +01:07:19.640 --> 01:07:23.160 +Like I said, eight core server for 30 bucks. + +01:07:23.600 --> 01:07:24.680 +You know, that's insane. + +01:07:25.299 --> 01:07:27.260 +And when I first signed up, + +01:07:27.400 --> 01:07:29.720 +that came with 20 terabytes of free traffic. + +01:07:30.180 --> 01:07:30.440 +Wow. + +01:07:30.780 --> 01:07:35.520 +Which is about $1,700 out of AWS. + +01:07:36.380 --> 01:07:36.520 +Right. + +01:07:36.720 --> 01:07:38.220 +Included in your $30 bill. + +01:07:38.620 --> 01:07:39.000 +You know what I mean? + +01:07:39.140 --> 01:07:40.400 +Like, oh my gosh. + +01:07:41.000 --> 01:07:41.160 +Yeah. + +01:07:41.540 --> 01:07:43.560 +So yeah, I talk a lot about it in the book, + +01:07:43.740 --> 01:07:46.280 +but yeah, I went over and moved my stuff over there + +01:07:46.310 --> 01:07:48.140 +and it's been good. + +01:07:48.540 --> 01:07:50.900 +I've had one incident where the machine + +01:07:51.100 --> 01:07:52.300 +that it was on died. + +01:07:52.760 --> 01:07:54.980 +The one big server, wherever it was, it died + +01:07:55.460 --> 01:07:56.640 +and they had to move it, + +01:07:57.040 --> 01:07:58.020 +which blows my mind, + +01:07:58.440 --> 01:08:01.340 +they were able to hot relocate it to another server. + +01:08:01.870 --> 01:08:03.880 +But the problem is it has an external, + +01:08:04.160 --> 01:08:05.780 +like a 300 gig external drive, + +01:08:06.340 --> 01:08:07.640 +and that didn't move location. 
+
+01:08:07.950 --> 01:08:10.480
+So all of a sudden, a lot of the IO operations
+
+01:08:10.880 --> 01:08:12.500
+were much slower 'cause they weren't close
+
+01:08:12.710 --> 01:08:13.520
+to the server anymore.
+
+01:08:13.800 --> 01:08:14.540
+- Right, right.
+
+01:08:14.680 --> 01:08:17.040
+- Why, why did my Docker builds take two minutes?
+
+01:08:17.109 --> 01:08:18.500
+They used to take about three or four seconds.
+
+01:08:18.589 --> 01:08:19.700
+I cannot figure it out.
+
+01:08:20.020 --> 01:08:22.020
+And I wrote them, they're like, no, we've tested it.
+
+01:08:22.080 --> 01:08:22.480
+There's no problem.
+
+01:08:22.560 --> 01:08:24.700
+I don't care what you say, there's a huge problem.
+
+01:08:25.520 --> 01:08:30.600
+Eventually they're like, we moved it again, it's fine. And then it was fine, right? So, you know, if
+
+01:08:31.540 --> 01:08:36.880
+folks are looking for something slightly lighter weight, and this is going to sound like a commercial,
+
+01:08:37.120 --> 01:08:42.160
+I'm just a happy customer, no sponsorship or whatever, but I use Opal Stack with a lot of my
+
+01:08:42.259 --> 01:08:49.579
+clients. What was that, Opal Stack? Opal. Yeah. And you wouldn't go full Docker with it,
+
+01:08:49.600 --> 01:08:53.680
+but they do give you full SSH access to the box.
+
+01:08:54.089 --> 01:08:56.620
+And they've got a neat little sort of packaging thing.
+
+01:08:56.880 --> 01:08:58.319
+They don't support a lot of things,
+
+01:08:58.430 --> 01:09:00.000
+but if you've got like Django or Flask
+
+01:09:00.600 --> 01:09:02.580
+or static files for Nginx or whatever,
+
+01:09:02.779 --> 01:09:04.040
+you hit a couple of buttons in the dashboard
+
+01:09:04.660 --> 01:09:05.980
+and it spawns it up.
+
+01:09:06.279 --> 01:09:08.740
+But a lot of tools like that, it spawns it up
+
+01:09:08.750 --> 01:09:09.920
+and then you're not allowed to touch it. 
+ +01:09:10.279 --> 01:09:12.680 +What they do is create all the entries in the directories + +01:09:12.880 --> 01:09:14.240 +and then you can SSH into the box + +01:09:14.310 --> 01:09:15.319 +and get at the files themselves. + +01:09:15.569 --> 01:09:18.880 +So I find it's a nice little compromise between the two. + +01:09:19.339 --> 01:09:21.740 +It would not scale to what you're doing. + +01:09:22.620 --> 01:09:28.520 +But if folks are looking for a relatively inexpensive thing to experiment with, I find it's a nice little stopgap. + +01:09:28.839 --> 01:09:29.380 +Yeah, that's awesome. + +01:09:29.720 --> 01:09:32.319 +I'm always interested in finding those types of things. + +01:09:33.180 --> 01:09:33.960 +This one is new to me. + +01:09:34.060 --> 01:09:34.440 +This is cool. + +01:09:34.819 --> 01:09:39.640 +Yeah, this is I'm trying to remember there was a site I used to use. + +01:09:39.710 --> 01:09:40.520 +It got bought. + +01:09:40.940 --> 01:09:46.540 +The half of the founders went, we don't want to be bought and took their baseball bat and created Opal Stack. + +01:09:46.819 --> 01:09:50.120 +So I used to be a client of the original and followed them along. + +01:09:50.460 --> 01:09:50.920 +So yeah. + +01:09:51.180 --> 01:09:51.319 +Cool. + +01:09:51.500 --> 01:09:54.240 +And very happy with like the service as well. + +01:09:54.340 --> 01:09:59.620 +They're like you open a ticket and things are very, very human, which is nice in this day and age. + +01:09:59.630 --> 01:10:02.260 +You usually, I'm usually expecting to talk to a bot. + +01:10:02.540 --> 01:10:04.780 +You're getting about as much support as you get out of Gmail. + +01:10:05.240 --> 01:10:05.860 +Yeah, exactly. + +01:10:06.400 --> 01:10:08.100 +Google Docs, which is done. 
+ +01:10:08.860 --> 01:10:15.599 +Another thing worth a shout out here is sort of an alternate way of working with Docker and Docker Compose directly + +01:10:15.620 --> 01:10:18.780 +that I propose in the book is something called Coolify. + +01:10:18.900 --> 01:10:19.920 +Are you familiar with Coolify? + +01:10:20.300 --> 01:10:21.220 +No, this one I don't know. + +01:10:21.480 --> 01:10:22.640 +Yeah, this is super interesting. + +01:10:22.940 --> 01:10:27.620 +So what this does is it knows how to run Docker, Docker Compose, + +01:10:28.020 --> 01:10:31.560 +but it also gives you all sorts of easier ways. + +01:10:31.780 --> 01:10:33.180 +So if people look at what I'm proposing, + +01:10:33.580 --> 01:10:35.100 +they're like, no, Michael, too complicated. + +01:10:35.380 --> 01:10:37.360 +This is interesting because what it gives you + +01:10:37.360 --> 01:10:40.160 +is it basically gives you your own private Heroku + +01:10:40.620 --> 01:10:43.140 +or Netlify or Vercel or Railway, + +01:10:43.540 --> 01:10:45.440 +or you can go in, I don't know how to find it, + +01:10:45.560 --> 01:10:48.260 +from here, but you can also go in and say, + +01:10:48.660 --> 01:10:50.440 +let me find any self-hosted app. + +01:10:50.840 --> 01:10:51.120 +- Okay. + +01:10:51.260 --> 01:10:52.680 +- And they've got hundreds of them in there. + +01:10:52.700 --> 01:10:54.140 +And then you just type in the name and say, + +01:10:54.620 --> 01:10:57.140 +install this set of Docker containers + +01:10:57.300 --> 01:10:59.180 +as a Docker compose file into my server. + +01:10:59.640 --> 01:11:01.260 +So you could create the one big server, + +01:11:01.360 --> 01:11:04.040 +which is your own space in someone's cloud. 
+
+01:11:04.480 --> 01:11:06.280
+And then you can install this
+
+01:11:06.280 --> 01:11:07.740
+or you can pay them five bucks a month
+
+01:11:07.840 --> 01:11:09.300
+and they'll actually manage the server,
+
+01:11:10.080 --> 01:11:11.100
+manage the deployments,
+
+01:11:11.660 --> 01:11:15.160
+do like rollouts of new versions of your app,
+
+01:11:15.520 --> 01:11:21.940
+stuff like that, right? It sounds like it makes it way easier, right? It actually makes it...
+
+01:11:22.260 --> 01:11:28.900
+It's like two steps forward, 1.8 steps backwards, right? Because, you know, instead of using .env
+
+01:11:29.180 --> 01:11:35.080
+files, you've got this UI to enter a bunch of environment variables, and the saving of them
+
+01:11:35.100 --> 01:11:39.900
+is weird, and you're like, oh, I forgot to press save on these three, even though I saved the page. I mean,
+
+01:11:39.820 --> 01:11:45.440
+there's just... Right, right. It promises more ease than you would think. And I'm not necessarily going to
+
+01:11:45.690 --> 01:11:50.000
+switch. I do like it. I've played with it some. I'm not saying I would switch to it, given a choice,
+
+01:11:50.520 --> 01:11:55.620
+but it does ease you in. It's a little bit like PythonAnywhere. Like, I'm sure when I started,
+
+01:11:55.670 --> 01:11:59.560
+there were things that could have gotten in my way, but the support that it gave me made it
+
+01:12:00.020 --> 01:12:04.539
+possible for me to feel comfortable and get going. I feel like this might be an option for
+
+01:12:04.560 --> 01:12:05.480
+people who care.
+
+01:12:05.880 --> 01:12:05.980
+Right.
+
+01:12:06.460 --> 01:12:07.820
+But let me give you an example.
+
+01:12:07.940 --> 01:12:14.920
+For example, I could go install an app that has Postgres, Valkey, and the web app. 
+ +01:12:15.120 --> 01:12:20.140 +If I, then I just click install that from wherever self-hosted definition that comes + +01:12:20.280 --> 01:12:21.720 +from, it creates those three containers. + +01:12:22.200 --> 01:12:26.280 +And then on the container setting through the image setting, I don't know really how + +01:12:26.300 --> 01:12:26.800 +you think of it. + +01:12:26.980 --> 01:12:31.019 +I mean, they're not, I guess it's image sort of, you can go to the web part and say, and + +01:12:31.040 --> 01:12:35.780 +just use this URL and it'll automatically do the SSL generation as part of that. + +01:12:36.140 --> 01:12:39.580 +Then you go to the database, the Postgres thing, you say, oh, and make backups for + +01:12:39.580 --> 01:12:43.480 +me daily and store them in this S3 compatible storage. + +01:12:43.980 --> 01:12:47.400 +And that kind of stuff is a lot of extra when you're doing it yourself and you + +01:12:47.420 --> 01:12:48.420 +just go check those boxes. + +01:12:48.700 --> 01:12:51.940 +So that's the one point forward, that's the two forward, but then there's the + +01:12:52.060 --> 01:12:52.400 +step back. + +01:12:52.620 --> 01:12:52.680 +Yeah. + +01:12:52.740 --> 01:12:52.840 +Yeah. + +01:12:52.980 --> 01:12:55.840 +Well, and that tends to be also what makes people nervous, right? + +01:12:56.000 --> 01:13:02.480 +So like that, and that's, you know, I still use managed database simply because I don't want to + +01:13:02.480 --> 01:13:06.860 +have to think about it, right? Like it's like, yeah, okay. I'm perfectly fine with pointing my + +01:13:07.060 --> 01:13:11.820 +app at a managed database and let somebody else think about backing it up and all the rest of it. + +01:13:12.020 --> 01:13:16.840 +Yeah. Yeah. 
You know, one thing about managed databases that I don't like, and I can't speak
+
+01:13:16.920 --> 01:13:21.599
+to all potential hosts of them, but certainly some of them, some well-known ones, some names I've
+
+01:13:21.620 --> 01:13:26.260
+already said, if you get a managed database there, that database server is listening on the public
+
+01:13:26.520 --> 01:13:33.820
+internet. I very much do not espouse having a database listening. Yeah, it has a password,
+
+01:13:34.080 --> 01:13:38.480
+but I mean, that's that database. I'm always worried about what is in the database.
+
+01:13:38.640 --> 01:13:41.220
+That's interesting. I've never thought to even check that.
+
+01:13:41.560 --> 01:13:46.840
+And on my setup, not only is it not listening on the internet, it's not even listening on the
+
+01:13:46.840 --> 01:13:52.680
+host. There's like a private Docker network that only things right in the shared Docker
+
+01:13:52.880 --> 01:13:58.260
+network can even know about or see the data. You know what I mean? So there's... Yep. It's less
+
+01:13:58.480 --> 01:14:04.140
+likely to... fewer holes. But I have to make backups, and if I don't, it's bad. If it goes down, it's real
+
+01:14:04.260 --> 01:14:09.820
+bad. And it did one time this year. I was down for like 10 minutes, about to pull my hair out. There goes...
+
+01:14:09.800 --> 01:14:11.160
+- That was your sixth nine, yes.
+
+01:14:11.620 --> 01:14:12.680
+- Exactly, I know.
+
+01:14:13.500 --> 01:14:17.020
+So the problem was, just for people who wanna benefit
+
+01:14:17.160 --> 01:14:18.900
+from my suffering and not suffer themselves,
+
+01:14:19.120 --> 01:14:23.440
+is I did not, on the Docker pull for the database image,
+
+01:14:23.480 --> 01:14:25.140
+I didn't pin it to a major version. 
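The exposure question raised here, whether a managed database is really answering on the open internet, comes down to a plain TCP connect from outside the hosting network. A standard-library-only sketch; the host and port shown in the comment are placeholders, not a real service:

```python
import socket

def is_listening(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        # create_connection resolves the name and attempts the handshake
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

# Placeholder endpoint: a managed Postgres that answers here is
# listening on the public internet, password or not.
# is_listening("db.example.com", 5432)
```

Run from a machine outside your private network; a `True` for your database host means anyone on the internet can at least reach the login prompt.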
+ +01:14:26.060 --> 01:14:27.260 +And so it upgraded and then it said, + +01:14:27.340 --> 01:14:28.980 +well, you have old data in your file system + +01:14:29.360 --> 01:14:30.980 +and we're not gonna upgrade it for you automatically. + +01:14:31.320 --> 01:14:32.120 +So we're not gonna run. + +01:14:32.180 --> 01:14:33.880 +I'm like, why is the database server not running? + +01:14:33.980 --> 01:14:36.160 +It just, and it was like a weird update. + +01:14:36.160 --> 01:14:39.360 +It was like 8.2.1 that broke. + +01:14:39.580 --> 01:14:39.780 +Well, why? + +01:14:40.520 --> 01:14:40.960 +Point one. + +01:14:41.940 --> 01:14:46.460 +Surely, surely a bigger number needs to be like, this will never run again. + +01:14:49.200 --> 01:14:51.540 +Anyway, you know, you find the stuff out the hard way, but yes. + +01:14:51.970 --> 01:14:55.800 +Well, yeah, that that's the negative, not using a managed database. + +01:14:56.360 --> 01:14:56.540 +Yes. + +01:14:56.900 --> 01:14:56.960 +Yeah. + +01:14:57.050 --> 01:14:57.160 +Yeah. + +01:14:57.270 --> 01:14:59.260 +Cause you have to deal with some of that kind of stuff yourself. + +01:14:59.780 --> 01:14:59.880 +Yeah. + +01:15:00.070 --> 01:15:04.620 +So I thought we would wrap up by reviving an old tradition. + +01:15:04.870 --> 01:15:06.240 +I have two questions for you. + +01:15:06.620 --> 01:15:09.500 +What is, what, what, what is your development environment? + +01:15:09.560 --> 01:15:11.780 +and what library are you excited about? + +01:15:12.040 --> 01:15:14.520 +So the development environment right now + +01:15:15.120 --> 01:15:20.380 +is a mix of Cursor and PyCharm for sort of editing. + +01:15:21.080 --> 01:15:24.840 +And despite this very detailed conversation about Docker, + +01:15:24.980 --> 01:15:28.200 +I don't use Docker very much locally for development. + +01:15:28.240 --> 01:15:29.460 +I just use virtual environments. 
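The two lessons in this anecdote, keep the database off the public network and pin the image so an unattended pull can't jump across an incompatible major upgrade, can both be expressed in a Docker Compose file. A minimal sketch under stated assumptions: the service names and the application image are made up.

```yaml
services:
  web:
    image: my-web-app:latest      # hypothetical application image
    ports:
      - "443:8000"                # only the web app is published
    networks: [frontend, backend]
  db:
    image: postgres:16            # pinned major version: pulls stay on 16.x
    networks: [backend]           # no `ports:` entry, so nothing is exposed

networks:
  frontend: {}
  backend:
    internal: true                # containers on this network only; no host/public access
```

With `internal: true`, only containers attached to `backend` can even resolve the database, and pinning `postgres:16` means a routine `docker compose pull` never silently hands you a data directory the new server refuses to start against.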
+
+01:15:29.600 --> 01:15:30.980
+And I want to give a shout out to Hynek,
+
+01:15:31.280 --> 01:15:33.160
+who I had some back and forth with
+
+01:15:33.200 --> 01:15:34.400
+when I was writing some of this stuff
+
+01:15:34.520 --> 01:15:36.080
+that gave me some really good ideas.
+
+01:15:36.160 --> 01:15:37.700
+And he has a really good article,
+
+01:15:37.860 --> 01:15:38.840
+which I referenced in the book,
+
+01:15:38.980 --> 01:15:41.000
+about, you just use virtual environments.
+
+01:15:41.660 --> 01:15:42.760
+Keep everything consistent, right?
+
+01:15:42.880 --> 01:15:45.360
+That's an interesting debate that we don't have time for,
+
+01:15:45.400 --> 01:15:45.960
+but it's very fun.
+
+01:15:46.340 --> 01:15:48.500
+So uv, I'm a huge fan of uv.
+
+01:15:50.400 --> 01:15:52.800
+- Particularly in Docker, that makes things that much faster.
+
+01:15:53.260 --> 01:15:55.220
+- Yeah, because you can just say in your Docker,
+
+01:15:55.440 --> 01:15:57.260
+it used to be you're like, okay, well,
+
+01:15:57.370 --> 01:15:59.020
+I gotta use Docker and I need to use Python.
+
+01:15:59.220 --> 01:16:01.700
+So let me use the official Python distribution for Docker
+
+01:16:01.940 --> 01:16:02.820
+because I need to have Python.
+
+01:16:03.260 --> 01:16:08.200
+And then, well, that excludes 99.9%
+
+01:16:08.220 --> 01:16:10.000
+of all the other things you could build upon
+
+01:16:10.320 --> 01:16:12.200
+that already have something that's harder to manage
+
+01:16:12.860 --> 01:16:13.800
+set up for you, right?
+
+01:16:14.280 --> 01:16:15.960
+But in your Dockerfile, you just say,
+
+01:16:16.160 --> 01:16:20.340
+run uv venv --python 3.14,
+
+01:16:20.680 --> 01:16:23.200
+and you've installed Python 3.14 in two seconds.
+
+01:16:23.820 --> 01:16:24.820
+And it's cached, right?
+
+01:16:24.960 --> 01:16:27.620
+It's like, yeah, it just, it makes it so much faster
+
+01:16:27.900 --> 01:16:30.820
+and so powerful, but also just in general, right? 
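The Dockerfile pattern being described, start from whatever base image your app actually needs and let uv bring Python, might look like the following sketch. The base image and Python version are illustrative; the `COPY --from` line is the pattern uv's own Docker documentation uses, and `uv venv --python 3.14` downloads a managed CPython if one isn't already cached:

```dockerfile
# Start from the base image your app needs, rather than the
# official Python image.
FROM debian:bookworm-slim

# Copy the static uv binary in (pattern from uv's Docker docs).
COPY --from=ghcr.io/astral-sh/uv:latest /uv /uvx /bin/

# uv fetches and caches a managed CPython, then creates the venv.
RUN uv venv --python 3.14 /opt/venv
ENV PATH="/opt/venv/bin:$PATH"
```

Because the interpreter download is cached in a layer, rebuilds skip it entirely, which is where the "two seconds" figure comes from.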
+
+01:16:30.900 --> 01:16:33.180
+Like it's unified so many tools that I like
+
+01:16:33.300 --> 01:16:34.400
+that are just, it's all there together.
+
+01:16:34.960 --> 01:16:36.540
+And then library, oh my goodness.
+
+01:16:37.080 --> 01:16:38.180
+- Now you know how it feels.
+
+01:16:38.220 --> 01:16:45.300
+- Like, I know how it feels. And no, I didn't warn you on purpose. I love it, I love it. So there's...
+
+01:16:45.840 --> 01:16:49.460
+There are a bunch of ones I've been playing with lately, and I'm trying to think which one I've
+
+01:16:49.760 --> 01:16:54.320
+used. I don't really have a great answer to this, Chris, I'm afraid to say. I would say,
+
+01:16:54.680 --> 01:16:59.340
+let's keep it flowing with some of the vibes that we had here. I would say,
+
+01:16:59.780 --> 01:17:06.920
+let me give a shout out to setproctitle, which, there you go, sounds insanely silly. Like, the goal
+
+01:17:06.940 --> 01:17:10.740
+of that is, so in your process, in your Python process, and I actually use this
+
+01:17:10.820 --> 01:17:14.680
+on a bunch of different things, in your Python process you can say
+
+01:17:14.920 --> 01:17:18.560
+setproctitle.setproctitle() and you give it the name of whatever you want your process
+
+01:17:18.720 --> 01:17:24.500
+to be. So why does that matter? When you pull up all these tools like Glances,
+
+01:17:24.720 --> 01:17:28.220
+btop, or others, anything that looks at processes, basically instead of seeing
+
+01:17:28.380 --> 01:17:31.980
+Python, Python, Python, node, node, node, Postgres, Postgres, Postgres, at least the
+
+01:17:32.020 --> 01:17:36.900
+Python ones now have meaningful names. And you might be thinking, well, Michael,
+
+01:17:36.920 --> 01:17:38.560
+I don't run that much in production, useless to me.
+
+01:17:38.860 --> 01:17:40.260
+No, it's good for development too.
+
+01:17:40.620 --> 01:17:43.220
+Have you ever had the idea,
+
+01:17:43.360 --> 01:17:45.480
+like I wanna know how much memory my process is using? 
+ +01:17:46.020 --> 01:17:47.000 +Is it using a lot or a little? + +01:17:47.200 --> 01:17:49.240 +So you pull up, you know, activity monitor, + +01:17:49.440 --> 01:17:52.120 +task manager, whatever, you see Python, Python, Python, + +01:17:52.300 --> 01:17:55.000 +you're like, oh man, I know my editor's using one of these + +01:17:55.420 --> 01:17:57.040 +or whatever, but which one is it? + +01:17:57.160 --> 01:18:00.400 +- And if you're using the right terminal, + +01:18:00.640 --> 01:18:02.300 +it'll change the terminal's title too, + +01:18:02.570 --> 01:18:04.620 +because most terminals respond to the proc name. + +01:18:05.220 --> 01:18:10.240 +Oh, that's a very nice touch. Yeah. Okay. Yeah. So, but if you, if you do that in development, + +01:18:10.390 --> 01:18:15.080 +if you just set the name of your process to be like, you know, my utility or whatever the heck + +01:18:15.080 --> 01:18:19.640 +you call it, right. Then when you go into process management tools, like even just for Mac or + +01:18:19.770 --> 01:18:25.200 +windows or whatever, you'll see it and you can see how much CPU is it using? Is it using a lot of RAM? 
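The one-function package under discussion here is `setproctitle` (third-party, `pip install setproctitle`). A hedged sketch: the import is guarded so the program still runs where the package isn't installed, and the process name passed at the bottom is a made-up example.

```python
# setproctitle is a third-party package (pip install setproctitle);
# fall back to a no-op where it isn't available.
try:
    from setproctitle import setproctitle
except ImportError:
    def setproctitle(title: str) -> None:
        pass

def name_process(title: str) -> None:
    """Rename this process so tools such as top, btop, or Activity
    Monitor show `title` instead of a generic `python`."""
    setproctitle(title)

name_process("my-billing-worker")  # hypothetical worker name
```

Call it once at startup, before spawning any workers, so every process listing and terminal title shows a meaningful name.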
+
+01:18:25.570 --> 01:18:29.600
+If you've got to end-task it... Like, we now have another reason that this is something we've
+
+01:18:29.600 --> 01:18:35.160
+got to do all the time: sometimes the agentic AI things go mad and they start a bunch of
+
+01:18:35.180 --> 01:18:39.640
+servers, and then they lose track of them, and then you can't run anymore because it says the port is
+
+01:18:39.640 --> 01:18:45.840
+in use. You're like, but where? Like, something in that stream of text that shot by for five minutes,
+
+01:18:45.980 --> 01:18:51.540
+it started one and then it left it going. But then you pull it up, it says Python, Python, Python, and
+
+01:18:51.660 --> 01:18:56.180
+you're like, well, I don't want to kill the other thing that's running. You know what I mean? So it also
+
+01:18:56.280 --> 01:19:01.260
+gives you a way to kill off your AI-abandoned stuff that it went mad on. So there you go, setting
+
+01:19:01.280 --> 01:19:06.200
+a process name might save you a reboot. There's your little nugget to take away from the podcast.
+
+01:19:06.520 --> 01:19:10.520
+Exactly. It's a package with one function, but it's a good one.
+
+01:19:11.020 --> 01:19:16.520
+Excellent. Well, thank you for having me on. This has been fun to sort of reverse the tables on you.
+
+01:19:16.720 --> 01:19:17.560
+It's been great.
+
+01:19:18.060 --> 01:19:21.680
+Yeah. Chris, thank you so much. I really appreciate it. And always great to catch up with you. Bye.
+
+01:19:21.680 --> 01:19:22.300
+It's been fun to be here.
+
+01:19:23.640 --> 01:19:27.740
+This has been another episode of Talk Python To Me. Thank you to our sponsors. Be sure to check
+
+01:19:27.760 --> 01:19:32.160
+out what they're offering. It really helps support the show. Look into the future and see bugs before
+
+01:19:32.250 --> 01:19:38.120
+they make it to production. 
Sentry's Seer AI code review uses historical error and performance + +01:19:38.380 --> 01:19:43.620 +information at Sentry to find and flag bugs in your PRs before you even start to review them. + +01:19:44.300 --> 01:19:50.500 +Stop bugs before they enter your code base. Get started at talkpython.fm/seer-code-review. + +01:19:51.340 --> 01:19:57.520 +Agency. Discover agentic AI with agency. Their layer lets agents find, connect, and work together. + +01:19:57.880 --> 01:20:04.020 +any stack, anywhere. Start building the internet of agents at talkpython.fm/agency spelled + +01:20:04.280 --> 01:20:10.780 +A-G-N-T-C-Y. If you or your team needs to learn Python, we have over 270 hours of beginner and + +01:20:10.960 --> 01:20:17.100 +advanced courses on topics ranging from complete beginners to async code, Flask, Django, HTMX, + +01:20:17.160 --> 01:20:23.040 +and even LLMs. Best of all, there's no subscription in sight. Browse the catalog at talkpython.fm. + +01:20:23.720 --> 01:20:27.100 +And if you're not already subscribed to the show on your favorite podcast player, + +01:20:27.500 --> 01:20:28.380 +What are you waiting for? + +01:20:29.080 --> 01:20:30.820 +Just search for Python in your podcast player. + +01:20:30.920 --> 01:20:31.800 +We should be right at the top. + +01:20:32.240 --> 01:20:33.640 +If you enjoyed that geeky rap song, + +01:20:33.880 --> 01:20:35.120 +you can download the full track. + +01:20:35.220 --> 01:20:37.120 +The link is actually in your podcast player show notes. + +01:20:37.920 --> 01:20:39.280 +This is your host, Michael Kennedy. + +01:20:39.680 --> 01:20:40.740 +Thank you so much for listening. + +01:20:40.940 --> 01:20:41.740 +I really appreciate it. + +01:20:42.180 --> 01:20:42.880 +I'll see you next time. + +01:20:55.920 --> 01:20:57.560 +Thank you. 
+
diff --git a/transcripts/532-python-2025-year-in-review.txt b/transcripts/532-python-2025-year-in-review.txt
new file mode 100644
index 0000000..82b9233
--- /dev/null
+++ b/transcripts/532-python-2025-year-in-review.txt
@@ -0,0 +1,2242 @@
+00:00:00 Python in 2025 is a delightfully refreshing place.
+
+00:00:04 The GIL's days are numbered, packaging is getting sharper tools,
+
+00:00:07 and the type checkers are multiplying like gremlins snacking after midnight.
+
+00:00:11 On this episode, we have an amazing panel to give us a range of perspectives
+
+00:00:15 on what mattered in 2025 in Python.
+
+00:00:19 We have Barry Warsaw, Brett Cannon, Gregory Kapfhammer, Jodie Burchell, Reuven Lerner,
+
+00:00:24 and Thomas Wouters on the show to give us their thoughts.
+
+00:00:28 This is Talk Python To Me, episode 532, recorded December 9th, 2025.
+
+00:00:50 Welcome to Talk Python To Me, the number one Python podcast for developers and data scientists.
+
+00:00:55 This is your host, Michael Kennedy.
+
+00:00:57 I'm a PSF fellow who's been coding for over 25 years.
+
+00:01:01 Let's connect on social media.
+
+00:01:03 You'll find me and Talk Python on Mastodon, BlueSky, and X.
+
+00:01:06 The social links are all in your show notes.
+
+00:01:09 You can find over 10 years of past episodes at talkpython.fm.
+
+00:01:12 And if you want to be part of the show, you can join our recording live streams.
+
+00:01:16 That's right.
+
+00:01:17 We live stream the raw uncut version of each episode on YouTube.
+
+00:01:20 Just visit talkpython.fm/youtube to see the schedule of upcoming events.
+
+00:01:25 Be sure to subscribe there and press the bell so you'll get notified anytime we're recording.
+
+00:01:29 Look into the future and see bugs before they make it to production.
+
+00:01:33 Sentry's SEER AI code review uses historical error and performance information at Sentry
+
+00:01:39 to find and flag bugs in your PRs before you even start to review them. 
+
+00:01:43 Stop bugs before they enter your code base.
+
+00:01:46 Get started at talkpython.fm/seer-code-review.
+
+00:01:50 Hey, before we jump into the interview, I just want to send a little message to all the companies out there with products and services trying to reach developers.
+
+00:01:59 That is the listeners of this show.
+
+00:02:01 As we're rolling into 2026, I have a bunch of spots open.
+
+00:02:05 So please reach out to me if you're looking to sponsor a podcast or just generally sponsor things in the community.
+
+00:02:12 And if you haven't necessarily considered podcasts,
+
+00:02:14 you really should reach out to me and I'll help you connect with the Talk Python audience.
+
+00:02:20 Thanks, everyone, for listening all of 2025, and here we go into 2026. Cheers. Hey, everyone, it's so
+
+00:02:27 awesome to be here with you all. Thanks for taking the time out of your day to be part of Talk Python
+
+00:02:31 for this year in review, this Python year in review. So yeah, let's just jump right into it. Gregory,
+
+00:02:38 welcome to the show, welcome back to the show. How you doing? Hi, I'm an associate professor
+
+00:02:42 of computer and information science, and I do research in software engineering and software
+
+00:02:46 testing. I've built a bunch of Python tools, and one of the areas we're studying now is flaky test
+
+00:02:51 cases in Python projects. I'm also really excited about teaching in a wide variety of areas. In fact,
+
+00:02:57 I use Python for operating systems classes or theory of computation classes. And one of the
+
+00:03:02 things I'm excited about is being a podcast host. I'm also a host on the Software Engineering Radio
+
+00:03:08 podcast sponsored by the IEEE Computer Society, and I've had the cool opportunity to interview a
+
+00:03:14 whole bunch of people in the Python community. So Michael, thanks for welcoming me to the show.
+
+00:03:18 Yeah, it's awesome to have you back. And we talked about flaky tests last time. 
I do have to say
+
+00:03:23 your AV setup is quite good. I love the new mic and all that. Thomas, welcome. Awesome to have
+
+00:03:29 you here. Thanks for having me. I'm Thomas Wouters. I'm a longtime Python core developer,
+
+00:03:34 although not as long as one of the other guests on this podcast. I worked at Google for 17 years.
+
+00:03:40 For the last year or so, I've worked at Meta. In both cases, I work on Python itself within the
+
+00:03:45 company and just deploying it internally. I've also been a board member of the PSF, although I'm not
+
+00:03:51 one right now. And I've been a steering council member for five years, and currently not, because
+
+00:03:58 the elections are going and I don't know what the result is going to be. But I think there's like
+
+00:04:02 a five in six chance that I'll be on the steering council, since we only have six candidates for
+
+00:04:09 five positions when this episode probably airs. I don't know. That's quite the contribution to the
+
+00:04:14 whole community. Thank you. I always forget this. I also got the, what is it, the Distinguished
+
+00:04:19 Service Award from the PSF this year. I should probably mention that. So yes, I have been
+
+00:04:24 recognized. No need to talk about it further. Wonderful. Wonderful. Jodie, welcome back on the
+
+00:04:29 show. Awesome to catch up with you. Yeah, thanks for having me back. I am a data scientist and
+
+00:04:34 developer advocate at JetBrains working on PyCharm. And I've been a data scientist for around 10
+
+00:04:39 years. And prior to that, I was actually a clinical psychologist. So that was my training,
+
+00:04:45 my PhD, but abandoned academia for greener pastures. Let's put it that way.
+
+00:04:50 Noah Franz-Gurig.
+
+00:04:53 Brett, hello. Good to see you.
+
+00:04:55 Hello. Yes. Yeah, let's see here. I've been at Microsoft for 10 years. I started doing,
+
+00:05:00 working on AI R&D for Python developers. 
+
+00:05:03 Also keep WASI running for Python here and do a lot of internal consulting for teams outside.
+
+00:05:09 I am actually the shortest running core developer on this call, amazingly,
+
+00:05:14 even though I've been doing it for 22 years.
+
+00:05:16 I've also only gotten the Frank Willison Award, not the DSA.
+
+00:05:19 So I feel very under accomplished here as a core developer.
+
+00:05:22 Yeah, that's me in a nutshell.
+
+00:05:24 Otherwise, I'm still trying to catch that.
+
+00:05:26 Most quoted.
+
+00:05:27 Yeah, most quoted.
+
+00:05:29 I will say actually at work, it is in my email footer that I'm a famous Python quotationist.
+
+00:05:33 That was Anthony Shaw's suggestion, by the way.
+
+00:05:35 That was not mine, but does link to the April Fool's joke from last year.
+
+00:05:39 And I am still trying to catch Anthony Shaw, I think, on appearances on this podcast.
+
+00:05:43 Well, plus one.
+
+00:05:44 Anthony Shaw should be here, honestly.
+
+00:05:46 I mean, I put it out into Discord.
+
+00:05:48 He could have been here, but probably at an odd time.
+
+00:05:51 You used to work on VS Code a bunch on the Python aspect of VS Code.
+
+00:05:54 You recently changed roles, right?
+
+00:05:57 Not recently.
+
+00:05:57 That was, I used to be the dev manager.
+
+00:05:59 Every seven years, years ago.
+
+00:06:01 Yeah, September of 2024.
+
+00:06:03 So it's been over a year.
+
+00:06:04 But yeah, I used to be the dev manager.
+
+00:06:04 That counts as recent for me.
+
+00:06:06 Yes, I used to be the dev manager for the Python experience in VS Code.
+
+00:06:09 Okay, very cool.
+
+00:06:10 That's quite a shift.
+
+00:06:11 Yeah, it went back to being an IC basically.
+
+00:06:13 You got all, you're good at your TPS reports again now?
+
+00:06:17 Actually, I just did do my connect, so I kind of did.
+
+00:06:19 Awesome.
+
+00:06:20 Reuven, I bet you haven't filed a TPS report in at least a year.
+
+00:06:23 So yeah, I'm Reuven. 
+ +00:06:24 I'm a freelance Python and Pandas trainer. + +00:06:28 I just celebrated this past week 30 years since going freelance. + +00:06:32 So I guess it's working out okay. + +00:06:35 We'll know at some point if I need to get a real job. + +00:06:38 I teach Python Pandas both at companies and on my online platform. + +00:06:41 I have newsletters. + +00:06:42 I've written books, speaking conferences, and generally try to help people improve their + +00:06:47 Python and Pandas fluency and confidence and have a lot of fun with this community as well + +00:06:51 as with the language. + +00:06:52 Oh, good to have you here. + +00:06:53 Barry, it's great to have a musician on the show. + +00:06:56 Thanks. + +00:06:58 Yeah, I've got my bases over here. + +00:07:00 So, you know, if you need to be serenaded. + +00:07:02 Yeah, like a Zen of Python may break out at any moment. + +00:07:05 You never know when it's going to happen. + +00:07:07 Thanks for having me here. + +00:07:08 Yeah, I've been a core developer for a long time, since 1994. + +00:07:13 And I've been, you know, in the early days, I did tons of stuff for Python.org. + +00:07:20 I worked with Guido at CNRI and we moved everything from the mailing, the Postmaster stuff, + +00:07:27 and the version control systems back in the day, websites, all that kind of stuff. + +00:07:32 I try to not do any of those things anymore. + +00:07:35 There's way more competent people doing that stuff now. + +00:07:39 I have been a release manager. + +00:07:42 I'm currently back on the steering council and running again. + +00:07:47 between Thomas and I, we'll see who makes it to six years, I guess. + +00:07:52 And I'm currently working for NVIDIA, and I do all Python stuff. + +00:07:58 Some half and half, roughly, of internal things and external open source community work, + +00:08:05 both in packaging and in core Python. + +00:08:08 That's, I guess, I think that's about it. 
+ +00:08:10 Yeah, you all are living in exciting tech spaces, that's for sure. + +00:08:14 That's for sure. + +00:08:15 For sure. + +00:08:16 Yeah. Well, great to have you all back on the show. Let's start with our first topic. So the + +00:08:20 idea is we've each picked at least a thing that we think stood out in 2025 in the Python space + +00:08:27 that we can focus on. And let's go with Jody first. I'm excited to hear what you thought was + +00:08:33 one of the bigger things. I'm going to mention AI. Like, wow, what a big surprise. So to kind of + +00:08:40 give context of where I'm coming from, I've been working in NLP for a long time. I like to say I was + +00:08:45 working on LLMs before they were cool. So sort of playing around with the very first releases from + +00:08:50 Google in like 2019, incorporating that into search. So I've been very interested sort of + +00:08:56 seeing the unfolding of the GPT models as they've grown. And let's say slightly disgusted by the + +00:09:03 discourse around the models as they become more mainstream, more sort of the talk about people's + +00:09:09 jobs being replaced, a lot of the hysteria, a lot of the doomsday stuff. So I've been doing talks + +00:09:15 and other content for around two and a half years now, just trying to cut through the hype a bit, + +00:09:19 being like, you know, they're just language models, they're good for language tasks. Let's think about + +00:09:23 realistically what they're about. And what was very interesting for me this year, I've been + +00:09:29 incorrectly predicting the bubble bursting for about two and a half years. So I was quite vindicated + +00:09:34 when in August, GPT-5 came out, and all of a sudden, everyone else started saying, + +00:09:40 maybe this is a bubble. + +00:09:41 Don't you think that was the first big release that was kind of a letdown compared to what the hype was? + +00:09:47 Yeah, and it was really interesting. 
+

00:09:48 So I found this really nice Atlantic article, and I didn't save it, unfortunately,

+

00:09:52 but essentially it told sort of the whole story of what was going on behind the scenes.

+

00:09:58 So GPT-4 came out in March of 2023, and that was the model that came out

+

00:10:03 with this Microsoft research paper saying, you know, sparks of AGI, artificial general intelligence,

+

00:10:08 blah, blah, blah. And from that point, there was really this big expectation sort of fueled by

+

00:10:15 OpenAI that GPT-5 was going to be the AGI model. And it turns out what was happening internally

+

00:10:22 is these scaling laws that were sort of considered, you know, this exponential growth thing that would

+

00:10:27 sort of push the power of these models perhaps towards human-like performance. They weren't

+

00:10:33 laws at all. And of course they started failing. So the model that they had originally pitched as

+

00:10:38 GPT-5 just didn't live up to performance. They started this post-training stuff where they were

+

00:10:43 going more into like specialized reasoning models. And what we have now are good models that are good

+

00:10:48 for specific tasks, but I don't know what happened, but eventually they had to put the GPT-5 label on

+

00:10:54 something. And yeah, let's say it didn't live up to expectations. So I think the cracks are starting

+

00:11:01 to show because the underlying expectation always was this will be improving to the point where

+

00:11:08 anything's possible and you can't put a price on that. But it turns out that maybe there's a

+

00:11:14 limit on what's possible, yeah, you can put a price on it. And a lot of the valuations are based on the

+

00:11:19 first part. Yes. And it's always been a bit interesting to me because I come from a scientific

+

00:11:24 background and you need to know how to measure stuff, right? And I'm like, what are you trying

+

00:11:28 to achieve? Like Gregory's nodding, like, please jump in.
I'm on my monologue, so please don't + +00:11:34 interrupt me. You really need to understand what you're actually trying to get these models to do. + +00:11:38 What is AGI? No one knows this. And what's going to be possible with this? And it's more science + +00:11:46 fiction than fact. So this for me has been the big news this year, and I'm feeling slightly smug, + +00:11:51 I'm going to be honest, even though my predictions were off by about a year and a half. + +00:11:55 Yeah, maybe it's not an exponential curve. + +00:11:56 It's a titration S curve with an asymptote. + +00:11:59 We'll see. + +00:12:00 Yeah, sigmoid. + +00:12:01 Yeah. + +00:12:02 Yeah, yeah, yeah. + +00:12:03 I mean, I think we have to sort of separate the technology from the business. + +00:12:07 And the technology, even if it doesn't get any better, even if we stay with what we have today, + +00:12:13 I still think this is like one of the most amazing technologies I've ever seen. + +00:12:18 It's not a god. + +00:12:19 It's not a panacea. + +00:12:21 But it's like a chainsaw that if you know how to use it, it's really effective. + +00:12:25 but in the hands of amateurs, you can really get hurt. And so, yes, it's great to see this sort of + +00:12:32 thing happening and improving, but who knows where it's going to go. And I'm a little skeptical of + +00:12:36 the AGI thing. What I'm a little more worried about is that these companies seem to have no + +00:12:40 possible way of ever making the money that they're promising to their investors. And I do worry a lot + +00:12:47 that we're sort of like a year 2000 situation where, yeah, the technology is fantastic, + +00:12:53 But the businesses are unsustainable. And out of the ashes of what will happen, we will get some amazing technology and even better than we had before. But there are going to be ashes. + +00:13:03 For me, that also makes me worry. And I don't know if anyone reads Ed Zitron here. 
He's a

+

00:13:09 journalist kind of digging into the state of the AI industry. He does, sort of,

+

00:13:15 have a reputation as a bit of a crank now. So I think he's leaned into that pretty hard,

+

00:13:20 but he does take the time to also pull out numbers and point out things that don't make sense.

+

00:13:25 And he was one of the first ones to blow the whistle on this circular funding we've been seeing.

+

00:13:30 So the worry, of course, is when a lot of this becomes borrowings from banks and then that starts dragging in funding from everyday people.

+

00:13:41 And also the effect that this has had on particularly the U.S. economy, like the stock market.

+

00:13:45 I think the investment in AI spending now exceeds consumer spending in the U.S., which is a really scary prospect.

+

00:13:54 That is crazy.

+

00:13:55 Mm-hmm. But yeah, also as Reuven said, I love LLMs. They are the most powerful tools we've

+

00:14:02 ever had for natural language processing. It's phenomenal the problems we can solve with them

+

00:14:06 now. I didn't think this sort of stuff would be possible when I started in data science.

+

00:14:10 I still think there's a use case for agents, although I do think they've been a bit overstated,

+

00:14:16 especially now that I'm building them. Let's say it's not very fun building

+

00:14:20 non-deterministic software. It's quite frustrating, actually. But I hope we're going to see improvements

+

00:14:25 in the frameworks, particularly I've heard good things about Pydantic AI. And yeah, hopefully we

+

00:14:30 can control the inputs and outputs and make them a bit more strict. This will fix a lot of the problems.

+

00:14:36 One thing I do want to put out in this conversation that I think is worth separating. And Reuven,

+

00:14:41 you touched on this some. I want to suggest to you, I'll throw this out to you all and see what

+

00:14:45 you think.
I think it's very possible that this AI bubble crashes the economy and causes bad things

+

00:14:51 economically to happen, and a bunch of companies that are like wrappers over the OpenAI API go away.

+

00:14:58 But I don't think things like the agentic coding tools will vanish. They might stop training, they

+

00:15:03 might slow their advance because that's super expensive. But, as you said, even if we

+

00:15:09 just had Claude Sonnet 4 and the world never got something else, it would be so much farther

+

00:15:16 beyond autocomplete and the other stuff that we had before, and Stack Overflow, that I don't

+

00:15:20 think it's going to go. The reason I'm throwing this out there is I was talking to somebody and

+

00:15:23 they were like, well, I don't think it's worth learning because I think the bubble is going to

+

00:15:26 pop. And so I don't want to learn this agentic coding because it won't be around very long.

+

00:15:30 What do you all think? It's here to stay. I think it's just, where's the limit? Where does it stop?

+

00:15:34 I think that's the big open question for everybody, right? Like pragmatically, it's a tool. It's useful

+

00:15:40 in some scenarios and not in others. And you just have to learn how to use the tool appropriately

+

00:15:44 for your use case and to get what you need out of it. And sometimes that's not using it because

+

00:15:47 it's just going to take up more time than it will to be productive. But other times it

+

00:15:51 fully juices up your productivity and you can get more done. It's give and take. But I don't think

+

00:15:56 it's going to go anywhere because as you said, Michael, there's even academics doing research now.

+

00:16:00 There's open weight models as well. There's a lot of different ways to run this, whether you're

+

00:16:05 at the scale of the frontier models that are doing these huge trainings or you're doing something

+

00:16:11 local and more specialized.
So I think the general use of AI isn't going anywhere. I think it's just

+

00:16:16 the question of how far can this current trend go and where will it, I want to say, stop? That's

+

00:16:22 because that plays into the whole it's going to completely go away. I don't think it ever

+

00:16:25 will. I think it's just going to be, where are we going to start to potentially bump up against

+

00:16:29 limits? One thing that I'll say is that many of these systems are almost, to me, like a dream come

+

00:16:33 true. Now, admittedly, it's the case that the systems I'm building are maybe only tens of thousands of

+

00:16:38 lines or hundreds of thousands of lines, but I can remember thinking to myself how cool would it be

+

00:16:44 if I had a system that could automatically refactor

+

00:16:47 and then add test cases and increase the code coverage

+

00:16:51 and make sure all my checkers and linters pass and do that automatically and continue the process

+

00:16:57 until it achieved its goal.

+

00:16:59 And I remember thinking that five to seven years ago,

+

00:17:01 I would never realize that goal in my entire lifetime.

+

00:17:05 And now when I use Anthropic's models through OpenCode or Claude Code,

+

00:17:10 it's incredible how much you can achieve so quickly,

+

00:17:13 even for systems that are of medium to moderate scale.

+

00:17:17 So from my vantage point, it is a really exciting tool.

+

00:17:20 It's incredibly powerful.

+

00:17:21 And what I have found is that the LLMs are much better

+

00:17:24 when I teach them how to use tools and the tools that they're using

+

00:17:29 are actually really quick, fast ones that can give rapid feedback to the LLM

+

00:17:34 and tell it whether it's moving in the right direction or not.

+

00:17:36 Yeah, there's an engineering angle to this.

+

00:17:39 It's not just vibe coding if you take the time to learn it.

+

00:17:42 There was actually a very interesting study.
I don't think the study itself has been released. + +00:17:48 I haven't found it yet, but I saw a talk on it by some guys at Stanford. So they call it the 10K + +00:17:53 developer study. And basically what they were doing was studying real code bases, including, + +00:17:59 I think 80% of them were actually private code bases and seeing the point where the team started + +00:18:05 adopting AI. And so their findings are really interesting and nuanced. And I think they probably + +00:18:10 intuitively align with what a lot of us have experienced with AI. So basically, yes, there + +00:18:16 are productivity boosts, but it produces a lot of code, but the code tends to be worse than the code + +00:18:21 you would write and also introduces more bugs. So when you account for the time that you spend + +00:18:27 refactoring and debugging, you're still more productive. But then it also depends on the + +00:18:32 type of project, as Gregory was saying. So it's better for greenfield projects, it's better for + +00:18:36 smaller code bases. It's better for simpler problems and it's better for more popular languages because + +00:18:41 obviously there's more training data. And so this was actually, I like this study so much. I'll + +00:18:46 actually share it with you, Michael, if you want to put it in the show notes, but it shows that, + +00:18:50 yeah, the picture is not that simple and all this conflicting information and conflicting experiences + +00:18:54 people were having line up completely with this. So again, like I work at an IDE company, it's tools + +00:19:00 for the job. It's not like your IDE will replace you. AI is not going to replace you. It's just + +00:19:06 going to make you maybe more productive sometimes. + +00:19:08 Yeah. + +00:19:09 Wait, IDE, you work for me. + +00:19:11 Right. + +00:19:12 It's not about you. + +00:19:13 But then I work for the IDE. + +00:19:18 This portion of Talk Python To Me is brought to you by Sentry. 
+

00:19:22 Let me ask you a question.

+

00:19:24 What if you could see into the future?

+

00:19:26 We're talking about Sentry, of course.

+

00:19:28 So that means seeing potential errors, crashes, and bugs before they happen, before you even

+

00:19:33 accept them into your code base.

+

00:19:35 That's what Sentry's Seer AI Code Review offers.

+

00:19:39 You get error prediction based on real production history.

+

00:19:43 Seer AI Code Review flags the most impactful errors your PR is likely to introduce before merge

+

00:19:50 using your app's error and performance context, not just generic LLM pattern matching.

+

00:19:55 Seer will then jump in on new PRs with feedback and warnings if it finds any potential issues.

+

00:20:01 Here's a real example.

+

00:20:03 On a new PR related to a search feature in a web app, we see a comment from the Seer by Sentry bot in the PR.

+

00:20:11 And it says, potential bug: the process search results function can enter an infinite recursion when a search query finds no matches,

+

00:20:19 as the recursive call lacks a return statement and a proper termination condition.

+

00:20:24 And Seer AI Code Review also provides additional details which you can expand for further information on the issue and suggested fixes.

+

00:20:32 And bam, just like that, Seer AI Code Review has stopped a bug in its tracks without any

+

00:20:37 devs in the loop.

+

00:20:38 A nasty infinite recursion bug never made it into production.

+

00:20:42 Here's how you set it up.

+

00:20:43 You enable the GitHub Sentry integration on your Sentry account, enable Seer AI on your

+

00:20:49 Sentry account, and on GitHub, you install the Seer by Sentry app and connect it to your

+

00:20:53 repositories that you want it to validate.

+

00:20:55 So jump over to Sentry and set up Code Review for yourself.

+

00:20:59 Just visit talkpython.fm/seer-code-review.

+

00:21:03 The link is in your podcast player show notes and on the episode page.
+

00:21:06 Thank you to Sentry for supporting Talk Python and me.

+

00:21:10 I mean, the other thing is a lot of people and a lot of the sort of when people talk about AI

+

00:21:15 and LLMs and so forth in context of coding, it's the LLM writing code for us.

+

00:21:20 And maybe because I'm not doing a lot of serious coding,

+

00:21:23 it's more instruction and so forth.

+

00:21:25 I use it as like a sparring or brainstorming partner. So it does, you know, checking of my newsletters for language and for tech edits and just sort of exploring ideas.

+

00:21:37 And for that, maybe it's because I do everything at the last minute and I don't have other people around or I'm lazy or cheap and don't want to pay them.

+

00:21:43 But definitely the quality of my work has improved dramatically.

+

00:21:46 The quality of my understanding has improved, even if it never wrote a line of code for me.

+

00:21:50 Just getting that feedback on a regular automatic basis is really helpful.

+

00:21:54 Yeah, I totally agree with you.

+

00:21:55 All right. We don't want to spend too much time on this topic, even though I believe Jody has put her finger on what might be the biggest tidal wave of 2025.

+

00:22:06 But still, quick parting thoughts. Anyone else?

+

00:22:08 I'm glad I'll never have to write Bash from scratch ever again.

+

00:22:12 Tell me about it.

+

00:22:13 Yeah.

+

00:22:15 I'll just say, anecdotally, the thing that I love about it is when I need to do something

+

00:22:22 and I need to go through docs, online docs for whatever it is, you know, it might be

+

00:22:28 GitLab or some library that I want to use or something like that.

+

00:22:32 I never even search for the docs.

+

00:22:33 I never even try to read the docs anymore.

+

00:22:35 I just say, hey, you know, whatever model, I need to set up this website.

+

00:22:41 And I just, just tell me what to do or just do it.

+

00:22:43 And it's an immense time saver and productivity.
+ +00:22:47 And then it gets me bootstrapped to the place where now I can start to be creative. + +00:22:52 I don't have to worry about just like digging through pages and pages and pages of docs to + +00:22:57 figure out one little setting here or there. + +00:23:00 That's an amazing time saver. + +00:23:02 Yeah, that's a really good point. + +00:23:03 Another thing that I have noticed, there might be many things for which I had a really good + +00:23:07 mental model, but my brain can only store so much information. + +00:23:10 So for example, I know lots about the abstract syntax tree for Python, but I forget that + +00:23:16 sometimes. + +00:23:16 And so it's really nice for me to be able to bring that back into my mind quickly with + +00:23:21 an LLM. + +00:23:22 And if it's generating code for me that's doing a type of AST parsing, I can tell whether + +00:23:27 that's good code or not because I can refresh that mental model. + +00:23:30 So in those situations, it's not only the docs, but it's something that I used to know + +00:23:35 really well that I have forgotten some of. + +00:23:37 And the LLM often is very powerful when it comes to refreshing my memory and helping me to get started and move more quickly. + +00:23:44 All right. Out of time, I think. Let's move on to Brett. What do you got, Brett? + +00:23:48 Well, I actually originally said we should talk about AI, but Jody had a way better pitch for it than I did because my internal pitch was a little bit AI. + +00:23:56 Do I actually have to write a paragraph explaining why? Then Jody actually did write the paragraph. So she did a much better job than I did. + +00:24:01 So the other topic I had was using tools to run your Python code. + +00:24:06 And what I mean by that is traditionally, if you think about it, + +00:24:10 you install the Python interpreter, right? 
+

00:24:13 Hopefully you create a virtual environment, install your dependencies,

+

00:24:16 and you call the Python interpreter in your virtual environment to run your code.

+

00:24:19 Those are all the steps you went through to run stuff.

+

00:24:21 But now we've got tools that will compress all that into a run command,

+

00:24:25 just do it all for you.

+

00:24:26 And it seems like the community has shown a level of comfort with that,

+

00:24:31 that I'd say snuck up on me a little bit, but I would say that I think it's a good thing, right?

+

00:24:37 It's showing us, I'm going to say us, as the junior core developer here on this call,

+

00:24:43 sorry to make you two feel old, but admittedly, Barry did write my letter of recommendation

+

00:24:48 to my master's program.

+

00:24:51 So what happened was like, yeah, we had Hatch and PDM,

+

00:24:55 Poetry before that, and uv as of last year, all kind of come through

+

00:24:59 and all kind of build on each other and take ideas from each other

+

00:25:02 and kind of just slowly build up this kind of repertoire of tool approaches

+

00:25:06 that they all kind of have a baseline kind of, synergy isn't the right word,

+

00:25:09 but share just kind of an approach to certain things with their own twists and added takes on things.

+

00:25:15 But in general, this whole like, you know what, you can just tell us to run this code

+

00:25:19 and we will just run it, right?

+

00:25:20 Like inline script metadata coming in and helping make that more of a thing.

+

00:25:24 Disclaimer, I was the PEP delegate for getting that in.

+

00:25:27 But I just think that's been a really awesome trend. And I'm hoping we can kind of leverage that a bit.

+

00:25:34 Like I have personal plans that we don't need to go into here,

+

00:25:36 but like I'm hoping as a Python core team, we can kind of like help boost this stuff up a bit

+

00:25:41 and kind of help keep a good baseline for this for everyone.
+ +00:25:43 Because I think it's shown that Python is still really good for beginners. + +00:25:45 You just have to give them the tools to kind of hide some of the details + +00:25:49 to not shoot yourself in the foot and still leads to a great outcome. + +00:25:52 Yeah, 2025 might be the year that the Python tools stepped outside of Python. + +00:25:56 Instead of being, you install Python and then use the tools. + +00:26:00 You do the tool to get Python, right? + +00:26:02 Like uv and PDM and others. + +00:26:03 Yeah, and inverted the dependency graph in terms of just how you put yourself in, right? + +00:26:08 I think the interesting thing is these tools treat Python as an implementation detail almost, right? + +00:26:13 Like when you just say uv or hatch run or PDM run thing, + +00:26:17 these tools don't make you have to think about the interpreter. + +00:26:19 It's just a thing that they pull in to make your code run, right? + +00:26:22 It's not even necessarily something you have to care about if you choose not to. + +00:26:26 And it's an interesting shift in that perspective, at least for me. + +00:26:30 But I've also been doing this for a long time. + +00:26:31 I think you're really onto something. + +00:26:33 And what I love at sort of a high level is this, I think there's a renewed focus on the user experience. + +00:26:40 And like uv plus the PEP 723, the inline metadata, you know, you can put uv in the shebang line of your script. + +00:26:49 And now you don't have to think about anything. + +00:26:52 You get uv from somewhere, and then it takes care of everything. + +00:26:57 And Hatch can work the same way, I think, for developers. + +00:27:01 But this renewed focus on installing your Python executable, + +00:27:08 you don't really have to think about, because those things are very complicated, + +00:27:12 and people just want to hit the ground running. 
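The inline script metadata (PEP 723) plus uv-in-the-shebang pattern described above looks roughly like this; the file body and its empty dependency list are illustrative, a minimal sketch rather than anything from the episode:

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.11"
# dependencies = []
# ///
# With the shebang above, running `./hello.py` (or `uv run hello.py`)
# lets uv fetch a matching interpreter and build an environment before
# executing; third-party packages would go in the `dependencies` list.
import sys

message = f"hello from Python {sys.version_info.major}.{sys.version_info.minor}"
print(message)
```

Because the metadata lives in comments, the same file still runs as an ordinary Python script; the `# /// script` block only matters to tools like uv, Hatch, or PDM that understand it.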
+ +00:27:14 And so if you think about the previous discussion about AI, + +00:27:18 I just want things to work. + +00:27:20 I know what I want to do. + +00:27:22 I can see it. + +00:27:23 I can see the vision of it. + +00:27:24 And I just don't want to. + +00:27:25 An analogy is like when I first learned Python and I came from C++ and all those languages. + +00:27:32 And I thought, oh my gosh, just to get like, hello world, + +00:27:35 I have to do a million little things that I shouldn't have to do. + +00:27:39 Like create a main and get my braces right and get all my variables right and get my pound includes correct. + +00:27:46 And now I don't have to think about any of that stuff. + +00:27:49 And the thing that was eye-opening for me with Python was the distance between vision of what I wanted and working code just really narrowed. + +00:28:00 And I think that as we are starting to think about tools and environments and how to bootstrap all this stuff, we're also now taking all that stuff away. + +00:28:09 Because people honestly don't care. + +00:28:11 I don't care about any of that stuff. + +00:28:13 I just want to go from like, I woke up this morning and had a cool idea and I just wanted to get at work. + +00:28:18 Or you wanted to share it so you could just share the script and you don't have to say, + +00:28:22 here's your steps that you get started with. + +00:28:24 Exactly. + +00:28:25 Exactly. + +00:28:26 I want to thank the two of you for, oh, sorry, sorry, go ahead. + +00:28:28 I'm just going to say, like, for years teaching Python that how do we get it installed? + +00:28:34 At first, it surprised me how difficult it was for people. + +00:28:37 Because like, oh, come on, we just got Python. + +00:28:39 Like, what's so hard about this? + +00:28:40 But it turns out it's a really big barrier to entry for newcomers. + +00:28:45 And I'm very happy that Jupyter Lite now has solved its problems with input. + +00:28:49 And it's like huge. 
+

00:28:50 But until now, I hadn't really thought about starting with uv because it's cross-platform.

+

00:28:57 And if I say to people in the first 10 minutes of class, install uv for your platform and

+

00:29:02 then say uv init your project, bam, you're done.

+

00:29:05 It just works.

+

00:29:06 And then it works cross-platform.

+

00:29:07 This is mind-blowing.

+

00:29:08 And I'm going to try this at some point.

+

00:29:10 Thank you.

+

00:29:10 I can comment on the mind-blowing part because now when I teach undergraduate students, we

+

00:29:14 start with uv in the very first class. And it is awesome. There were things that would take students,

+

00:29:20 even very strong students who've had lots of experience, it would still take them a week to

+

00:29:26 set everything up on their new laptop and get everything ready and to understand all the key

+

00:29:30 concepts and know where something is in their path. And now we just say, install uv for your

+

00:29:36 operating system and get running on your computer. And then, hey, you're ready to go. And I don't have to

+

00:29:42 teach them about Docker containers, and I don't have to tell them how to install Python with some

+

00:29:47 package manager. All of those things just work. And I think from a learning perspective, whether you're

+

00:29:52 in a class or whether you're in a company or whether you're teaching yourself, uv is absolutely

+

00:29:58 awesome. I'm actually wondering whether I am the one who is newest to Python here. I taught myself

+

00:30:05 Python in 2011, so I was at like the Python 2.7 stage, but it was my first programming language. I was just

+

00:30:12 procrastinating during my PhD. And I was like, I should learn to program. So I just taught myself

+

00:30:18 Python. And I can tell you, when you do not come from an engineering background, you're like,

+

00:30:23 what is Python? What is Python doing? Why am I typing Python to execute this hello world?
And

+

00:30:29 if you're kind of curious, you get down a rabbit hole before you even get to the point where you're

+

00:30:33 just focusing on learning the basics. And so it's exactly, I was going to say with Reuven,

+

00:30:39 and, like, whether you thought about it for teaching, because we're now debating for Humble Data,

+

00:30:43 which is a beginner's data science community that I'm part of, whether we switch to uv.

+

00:30:48 This was Cheuk's idea because it does abstract away all these details.

+

00:30:52 The debate I have is, is it too magic?

+

00:30:55 This is kind of the problem because I also remember learning about things like virtual

+

00:30:59 environments, because again, this was my first programming language, and being like,

+

00:31:02 oh, it's a very good idea.

+

00:31:04 This is best practices.

+

00:31:05 And it's also a debate we have in PyCharm, right?

+

00:31:08 Like how much do you magic away the fundamentals versus making people think a little bit, but

+

00:31:15 I'm not sure.

+

00:31:15 All right.

+

00:31:15 Like, would you even let somebody run without a virtual environment?

+

00:31:19 That's like, that's a take you, that's a stance you could take.

+

00:31:21 I used to when I first learned Python, because it was too complicated, but then I learned

+

00:31:27 better.

+

00:31:28 But yes.

+

00:31:29 The consideration here is like, hiding the details and having

+

00:31:34 all this magic just work is great as long as it works.

+

00:31:38 And the question is, how is it going to break down and how are people going to know how to deal

+

00:31:43 when it breaks down, if you hide all the magic?
And I think virtual envs were, or let's say before

+

00:31:49 we had virtual envs, installing packages was very much in the, you had to know all the details

+

00:31:54 because it was very likely going to break down in some way, right? Before we had virtual envs,

+

00:32:00 because you would end up with weird conflicts or multiple copies of a package installed in

+

00:32:04 different parts of the system. When we got virtual envs, we sort of didn't have to worry about that

+

00:32:10 anymore because we were trained that you can just blow away the virtual env and it just works.

+

00:32:14 And with uv, we're back into, this looks like a single installation. We don't know what's going

+

00:32:19 to go on. But we've learned, we as a community and also the people working on uv, we have learned

+

00:32:25 from those earlier mistakes, or maybe not mistakes, but consequences of the design.

+

00:32:32 And they have created something that appears to be very stable, where it's unlikely

+

00:32:38 the magic will break.

+

00:32:39 And when the magic does break, it's obvious what the problem is, or it automatically

+

00:32:44 fixes itself.

+

00:32:45 So, like, it's not reusing broken installations and that kind of thing.

+

00:32:50 So the risk now, as it turns out, I think as is proven by the community adopting uv so

+

00:32:57 fast and so willingly, I think it's acceptable.

+

00:33:00 Well, I think it's, yeah, I think it's proven itself.

+

00:33:02 It's clear that this is, it's worth the potential of discovering weird edge cases later, both

+

00:33:09 because it's probably low likelihood, but also the people behind uv, Astral, have proven that

+

00:33:16 they would jump in and fix those issues, right?

+

00:33:18 They would do anything they need to keep uv workable the same way.

+

00:33:23 And they have a focus that Python as a whole cannot have because they cater to fewer use

+

00:33:28 cases than Python as a whole needs to.
+ +00:33:31 On the audience, Galano says, as an enterprise tech director in Python coder, I believe we + +00:33:36 should hide the magic which empowers the regular employee to do simple things that make their + +00:33:40 job easier. + +00:33:41 Yeah. + +00:33:41 This notion of abstractions, right, has always been there in computer science. + +00:33:47 And, you know, we've used tools or languages or systems where we've tried to bring that + +00:33:53 abstraction layer up so that we don't have to think about all these details, as I mentioned + +00:33:58 before. The question is, that's always the happy path. And when I'm trying to teach somebody + +00:34:04 something like, here's how to use this library or here's how to use this tool, I try to be very + +00:34:09 opinionated to keep people on that happy path. Like, assume everything's going to work just right. + +00:34:15 Here's how you just make you go down that path to get the thing done that you want. The question + +00:34:19 really is when things go wrong, how narrow is that abstraction? And are you able, and even when + +00:34:27 you're just curious, like what's really going on underneath the hood? Of course, that's not a really + +00:34:31 good analogy today because cars are basically computers on wheels that you can't really + +00:34:36 understand how they work. But back in your day. But back in my day, we were changing spark plugs, + +00:34:42 you know, but and crank that window down. Exactly. So I think we always have to leave that + +00:34:50 room for the curious and the bad path where when things go wrong or when you're just like, + +00:34:56 you know what, I understand how this works, but I'm kind of curious about what's really going on. + +00:35:01 How easy is it for me to dive in and get a little bit more of that background, you know, + +00:35:07 a little bit more of that understanding of what's going on. 
I want the magic to decompose,

+

00:35:12 right? Like, you should be able to explain the magic path via more decomposed steps using the tool,

+

00:35:17 all the way down to what the tools like to do behind the scenes. Just to admit, the reason

+

00:35:21 I brought this up, and I've been thinking about this a lot, is I'm thinking of trying to get the Python

+

00:35:26 Launcher to do a bit more. Because one interesting thing we haven't really brought up here is we're

+

00:35:31 all seeing uv, uv, uv. uv is a company. There's always, there's, they might disappear, and we haven't

+

00:35:36 de-risked ourselves from that. Now we do have Hatch, we do have PDM, but as I said, there's kind

+

00:35:41 of a baseline I think they all share that I think they would be probably okay if the Python Launcher

+

00:35:45 just did, because that's based on standards, right? Because that's the other thing, that there's been

+

00:35:48 a lot of work that has led to this step, right? Like we've gotten way more packaging standards,

+

00:35:52 we've got PEP 723, like we mentioned. There's a lot of work that's come up to lead to this point

+

00:35:58 that all these tools can lean on to have them all have an equivalent outcome, because it's expected

+

00:36:03 that's how they should be.

+

00:36:05 And so I think it's something we need to consider of how do we make sure,

+

00:36:09 like, by the way, uv, I know the people, they're great.

+

00:36:12 I'm not trying to disparage them or think they're going to go away,

+

00:36:14 but it is something we have to consider.

+

00:36:16 And I will also say, Jody, I do think about this for teaching

+

00:36:20 because I'm a dad now and I don't want my kid coming home

+

00:36:24 when they get old enough to learn Python and go, hey, dad,

+

00:36:26 why is getting Python code running so hard?

+

00:36:29 So I want to make sure that that never happens.

+

00:36:32 But they fall in love with it from the start.
+
+00:36:34 I realized something for the 2026 year interview.
+
+00:36:37 I have to bring a sign that says time for next topic because we got a bunch of topics and we're running low on time.
+
+00:36:43 So, Thomas, let's jump over to yours.
+
+00:36:46 Oh, and I had two topics as well.
+
+00:36:48 So I'm only going to have to pick my favorite child, right?
+
+00:36:52 That's terrible.
+
+00:36:53 My second favorite child is Lazy Imports, which is a relatively new development.
+
+00:36:58 So we'll probably not get to that.
+
+00:37:00 And just accepted.
+
+00:37:00 Yes, it's been accepted and it's going to be awesome.
+
+00:37:03 So I'll just give that a shout out and then move to my favorite child, which is free threaded
+
+00:37:06 Python.
+
+00:37:07 For those who were not aware, the global interpreter lock is going away.
+
+00:37:12 I am stating it as a fact.
+
+00:37:13 It's not actually a fact yet, but it, you know, that's because the steering council hasn't
+
+00:37:18 realized the fact yet.
+
+00:37:20 It is trending towards.
+
+00:37:22 Well, I was originally on the steering council that accepted the proposal to add free threading
+
+00:37:28 as a, as an experimental feature, we had this idea of adding it as experimental and then making it
+
+00:37:34 supported, but not the default and then making it the default. And it was all a little vague and,
+
+00:37:39 and up in the air. And then I didn't get reelected for the steering council last year,
+
+00:37:44 which I was not sad about at all. I sort of ran on a, well, if there's nobody better, I'll do it,
+
+00:37:49 but otherwise I have other things to do. And it turns out those other things were making sure that
+
+00:37:54 free-threaded Python landed in a supported state. So I lobbied the steering council quite hard,
+
+00:38:00 as Barry might remember at the start of the year, to get some movement on this, like get some
+
+00:38:04 decision going. So for Python 3.14, it is officially supported.
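The lazy imports idea Thomas shouts out here has long had a stdlib building block: the accepted PEP adds first-class support, but `importlib`'s `LazyLoader` already defers executing a module until its first attribute access. This sketch only shows the long-documented recipe from the `importlib` docs, applied to `json` purely as an example:

```python
import importlib.util
import sys

def lazy_import(name: str):
    """Return a module object whose body only runs on first attribute access."""
    spec = importlib.util.find_spec(name)
    loader = importlib.util.LazyLoader(spec.loader)
    spec.loader = loader
    module = importlib.util.module_from_spec(spec)
    sys.modules[name] = module
    loader.exec_module(module)
    return module

json = lazy_import("json")
# The module body has not actually executed yet; this first attribute
# access is what triggers the real import.
print(json.dumps({"answer": 42}))  # {"answer": 42}
```

The win is startup time: a CLI tool can register dozens of heavy imports this way and only pay for the ones a given invocation touches.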
The performance is great. It's like
+
+00:38:10 between a couple of percent slower and 10% slower, depending on the hardware and the compiler that
+
+00:38:16 you use. It's basically the same speed on macOS, which is really like, that's a combination of
+
+00:38:23 the ARM hardware and Clang specializing things, but it's basically the same speed, which, wow.
+
+00:38:29 And then on recent GCCs on Linux, it's like a couple of percent slower. The main problem is
+
+00:38:35 really community adoption, getting third-party packages to update their extension modules for
+
+00:38:40 the new APIs and the things that by necessity sort of broke, and also supporting free threading
+
+00:38:48 in a good way in packages. For Python code, it turns out there's very few changes that
+
+00:38:54 need to be made for things to work well under free threading.
+
+00:38:57 They might not be entirely thread safe, but usually, like almost always, in cases where it
+
+00:39:02 wasn't thread safe before either, because the GIL doesn't actually affect thread safety.
+
+00:39:07 Just the likelihood of things breaking.
+
+00:39:09 I do think there's been a bit of a, the mindset of the Python community hasn't really been
+
+00:39:14 focused on creating thread safe code because the GIL is supposed to protect us.
+
+00:39:18 But as soon as it takes multiple steps, then all of a sudden it's just less likely.
+
+00:39:22 It's not that it couldn't happen.
+
+00:39:23 Yeah, that's my point, right?
+
+00:39:24 The GIL never gave you thread safety.
+
+00:39:27 The GIL gave CPython's internals thread safety.
+
+00:39:31 It never really affected Python code and it very rarely affected thread safety in
+
+00:39:36 extension modules as well.
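To make the thread-safety point concrete: an innocent-looking `+= 1` compiles to several bytecode steps, so it can race with or without the GIL, and the fix is the same on every build. A small illustrative sketch (the class and names are invented for the example):

```python
import threading

class SafeCounter:
    """A counter that stays correct with or without the GIL."""

    def __init__(self) -> None:
        self._value = 0
        self._lock = threading.Lock()

    def increment(self) -> None:
        # "self._value += 1" is read, add, write: several steps that another
        # thread can interleave with, even on a GIL build. The lock makes
        # the whole update atomic on any build, free-threaded or not.
        with self._lock:
            self._value += 1

    @property
    def value(self) -> int:
        return self._value

def hammer(counter: SafeCounter, n: int) -> None:
    for _ in range(n):
        counter.increment()

counter = SafeCounter()
threads = [threading.Thread(target=hammer, args=(counter, 25_000)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # 100000 on every build
```

Without the lock, the count can come up short; the GIL only makes that failure less likely, exactly as described above.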
+
+00:39:37 So they already had to take care of making sure that the global interpreter lock
+
+00:39:41 couldn't be released by something that they ended up calling indirectly. So it's actually not that
+
+00:39:46 hard to port most things to support free threading. And the benefits, we've seen some experimental
+
+00:39:53 work, because, you know, it's still new, there's still a lot of things that don't
+
+00:39:56 quite support it, there's still places where thread contention slows things down a lot, but
+
+00:40:02 we've seen a lot of examples of really promising very parallel problems that now speed up by 10x or
+
+00:40:09 more. And it's going to be really exciting in the future. And it's in 2025 that this all started.
+
+00:40:15 I mean, Sam started it earlier, but he's been working on this for years, but it landed in 2025.
+
+00:40:21 It dropped its experimental stage in 3.14, basically. Yeah. I was going to say, were we all,
+
+00:40:27 the three of us on the steering council at the same time when we decided to start the experiment
+
+00:40:30 for free threading? I think Barry wasn't on it. Yeah, I missed a couple of years there, but I'm
+
+00:40:36 not sure.
+
+00:40:36 No, I totally agree.
+
+00:40:37 I think free threading is one of the most transformative developments for Python, certainly since Python 3, but even maybe more impactful because of the size of the community today.
+
+00:40:49 Personally, you know, not necessarily speaking as a current or potentially former steering council member.
+
+00:40:56 We'll see how that shakes out.
+
+00:40:58 But I think it's inevitable.
+
+00:41:00 I think free threading is absolutely the future of Python, and I think it's going to unlock incredible potential and performance.
+
+00:41:08 I think we just have to do it right.
+
+00:41:10 And so I talked to lots of teams who are building various software all over the community.
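For teams experimenting with all of this, there is a quick way to see which build you are running on. This sketch is guarded so it also runs on older, GIL-only interpreters; `sys._is_gil_enabled()` only exists on 3.13+, and `Py_GIL_DISABLED` is the build flag for the free-threaded ("t") builds:

```python
import sys
import sysconfig

def gil_status() -> str:
    """Report whether this interpreter was built, and is running, without the GIL."""
    # Py_GIL_DISABLED is 1 on free-threaded builds; None/0 on standard builds.
    free_threaded_build = bool(sysconfig.get_config_var("Py_GIL_DISABLED"))
    # sys._is_gil_enabled() is 3.13+ only, so fall back gracefully.
    checker = getattr(sys, "_is_gil_enabled", None)
    gil_running = checker() if checker is not None else True
    if not free_threaded_build:
        return "standard build, GIL enabled"
    return "free-threaded build, GIL " + ("disabled" if not gil_running else "enabled")

print(gil_status())
```

Both answers matter: a free-threaded build can still re-enable the GIL at runtime (for example when an incompatible extension module is loaded), which is exactly the adoption problem discussed above.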
+
+00:41:16 And I actually think it's more of an educational and maybe an outreach problem than it is a technological problem.
+
+00:41:23 I mean, yes, there are probably APIs that are missing that will make people's lives easier.
+
+00:41:30 There's probably some libraries that will make other code a little easier to write or whatever or to understand.
+
+00:41:36 But like all that's solvable.
+
+00:41:38 And I think really reaching out to the teams that are, you know, like Thomas said,
+
+00:41:42 that are building the ecosystem, that are moving the ecosystem to a free threading world.
+
+00:41:47 That's where we really need to spend our effort.
+
+00:41:50 And we'll get there.
+
+00:41:51 It won't be that long.
+
+00:41:52 It certainly won't be as long as it took us to get to Python 3.
+
+00:41:56 I'm sort of curious as someone who's not super experienced with threading or, you know, basic concurrency.
+
+00:42:04 I mean, I've used it, but I feel like now we have threads, especially with free threading and sub interpreters and multiprocessing and asyncio.
+
+00:42:14 And I feel like for many people now it's like, oh, my God, which one am I supposed to use?
+
+00:42:19 And for someone who's experienced, you can sort of say, well, this seems like a better choice.
+
+00:42:24 But are there any plans to sort of try to have a taxonomy of what problems are solved by which of these?
+
+00:42:31 The premise here is that everyone would be using one or more of these low-level techniques that you mentioned.
+
+00:42:37 And I think that's not a good way of looking at it.
+
+00:42:40 Like AsyncIO is a library that you want to use for the things that AsyncIO is good at.
+
+00:42:46 And you can actually very nicely combine it with multiprocessing, with subprocesses,
+
+00:42:52 with subinterpreters, just to make it clear that those are two very separate
+
+00:42:57 things, and multithreading, both with and without free threading.
+
+00:43:01 And it solves different problems or it gives you different abilities within the AsyncIO
+
+00:43:06 framework.
+
+00:43:06 And the same is true for like GUI frameworks.
+
+00:43:09 I mean, GUI frameworks usually want threads for multiple reasons, but you can use these
+
+00:43:14 other things as well.
+
+00:43:15 I don't think it's down to teaching end users when to use or avoid all these different things.
+
+00:43:22 I think we need higher level abstractions for tasks that people want to solve.
+
+00:43:27 And then those can decide on what for their particular use case is a better approach.
+
+00:43:33 For instance, PyTorch has multiple.
+
+00:43:36 So it's used, for people who don't know, to train, not just train, but it's used in AI
+
+00:43:42 for generating large matrices and LLMs and what have you.
+
+00:43:46 Part of it is loading data and processing.
+
+00:43:49 And the basic ideas of AsyncIO are, oh, you can do all these things in parallel
+
+00:43:55 because you're not waiting on the CPU, you're just waiting on IO.
+
+00:43:58 Turns out it is still a good idea to use threads for massively parallel IO
+
+00:44:02 because otherwise you end up waiting longer than you need to.
+
+00:44:05 So a problem where we thought AsyncIO would be the solution
+
+00:44:10 and we never needed threads is actually much improved if we tie in threads as well.
+
+00:44:15 And we've seen massive, massive improvements in data loaders.
+
+00:44:19 There's even an article, a published article from some people at Meta
+
+00:44:24 showing how much they improve the PyTorch data loader by using multiple threads.
+
+00:44:29 But at a very low level, we don't want end users to need to make that choice, right?
+
+00:44:33 I think concurrent.futures is a good point, right?
+
+00:44:35 Like all of these approaches are all supported there and it's a unified one.
+
+00:44:39 So if you were to teach this, for instance, you could say use concurrent.futures.
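The "tie in threads with asyncio" point can be shown with the stdlib's `asyncio.to_thread`, which pushes blocking calls onto worker threads so the event loop stays free and the waits overlap instead of running back to back. The blocking function and delay values below are invented for illustration:

```python
import asyncio
import time

def blocking_read(delay: float) -> str:
    """Stand-in for a blocking call (file read, legacy DB driver, etc.)."""
    time.sleep(delay)
    return f"done after {delay}s"

async def main() -> list[str]:
    # Each blocking call runs on a worker thread; both 0.2s sleeps
    # overlap, so the total is ~0.2s rather than ~0.4s.
    return await asyncio.gather(
        asyncio.to_thread(blocking_read, 0.2),
        asyncio.to_thread(blocking_read, 0.2),
    )

start = time.perf_counter()
results = asyncio.run(main())
elapsed = time.perf_counter() - start
print(results)  # ['done after 0.2s', 'done after 0.2s']
```

This is the data-loader pattern in miniature: asyncio coordinates, threads do the waiting.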
+
+00:44:42 These are all there.
+
+00:44:44 This is the potential tradeoff.
+
+00:44:45 Like basically use threads.
+
+00:44:46 It's going to be the fastest, unless there's like some module you have that's
+
+00:44:51 screwing up because of threads, then use sub interpreters.
+
+00:44:53 And if for some reason sub interpreters don't work, you should move to the processing pool,
+
+00:44:57 the process pool.
+
+00:44:58 But I mean, basically, you just kind of go for the fast stuff.
+
+00:45:02 And if for some reason it doesn't work,
+
+00:45:03 use the next fastest and just kind of do it that way.
+
+00:45:05 After that, then you go to the lower level.
+
+00:45:08 Like, okay, why do I want to use subinterpreters instead of threads?
+
+00:45:11 Those kinds of trade-offs.
+
+00:45:12 But I think that's a different, as I think we're all searching,
+
+00:45:15 a different level of abstraction, which is a term we keep bringing up today.
+
+00:45:19 It's a level that a lot of people are not going to have to care about.
+
+00:45:21 I think the libraries are the ones that are going to have to care about this
+
+00:45:23 and who are going to do a lot of this for you.
+
+00:45:25 Let me throw this out on our way out the door to get to Reuven's topic.
+
+00:45:29 I would love to see it solidify around async and await.
+
+00:45:33 And you just await a thing, maybe put a decorator on something.
+
+00:45:36 say this, this one, I want this to be threaded. I want this to be IO. I want this. And you don't,
+
+00:45:42 you just use async and await and don't have to think about it, but that's, that's my dream.
+
+00:45:46 Reuven, what's your dream?
+
+00:45:48 Wow. How long do you have?
+
+00:45:51 No, what's your topic?
+
+00:45:52 So I want to talk about Python ecosystem and funding. When I talk to people who work with Python
+
+00:45:59 and I talk to them about how it's open source, they're like, oh, right, it's open source.
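The "try the fastest, fall back to the next" advice works largely because `concurrent.futures` gives all the pools the same `submit`/`map` interface, so swapping executors barely touches the surrounding code. A sketch of that, with an invented work function; `InterpreterPoolExecutor` is the 3.14+ subinterpreter-backed addition and `ProcessPoolExecutor` the long-standing process fallback, both drop-in replacements for the class used below:

```python
from concurrent.futures import ThreadPoolExecutor

def work(n: int) -> int:
    """An invented CPU-ish task: sum of squares below n."""
    return sum(i * i for i in range(n))

def run_batch(executor_cls, jobs):
    # The executor class is the only moving part. Start with threads;
    # if some module misbehaves under threads, pass InterpreterPoolExecutor
    # (3.14+) or ProcessPoolExecutor here instead -- same map() call.
    with executor_cls() as pool:
        return list(pool.map(work, jobs))

print(run_batch(ThreadPoolExecutor, [10, 100, 1000]))  # [285, 328350, 332833500]
```

That one-parameter swap is exactly the tiered taxonomy described above, expressed as code.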
That
+
+00:46:01 means I can download it for free. And from their perspective, that's sort of where it starts
+
+00:46:05 and ends. And the notion that people work on it, the notion that it needs funding, the notion that
+
+00:46:10 there's a Python Software Foundation that supports a lot of these activities, the infrastructure,
+
+00:46:15 is completely unknown to them and even quite shocking for them to hear. But Python is in many
+
+00:46:21 ways, I think, starting to become a victim of its own success, that we've been dependent on
+
+00:46:27 companies for a number of years to support developers and development. And we've been
+
+00:46:32 assuming that the PSF, which gives money out to lots of organizations to run conferences and
+
+00:46:38 workshops and so forth, can sort of keep scaling up and that they will have enough funding. And
+
+00:46:43 we've seen a few sort of shocks to that system in the last year. Most recently, the PSF announced that
+
+00:46:48 it was no longer going to be doing grants, or rather, it sort of pared down, about a year ago, what it would give
+
+00:46:52 money for. And then about five months ago, six months ago, I think it was in July or August,
+
+00:46:56 they said, actually, we're not going to be able to fund anything for about a year now.
+
+00:47:00 And then there was the government grant, I think from the NSF, that they turned down. And I'm not disputing the reasons for that at all. It basically said, well, we'll give you the money if you don't worry about diversity and inclusion. And given that that's like a core part of what the PSF is supposed to do, they could not do that without shutting the doors, which would be kind of counterproductive.
+
+00:47:17 And so I feel like we're not yet there, but we're sort of approaching what I'm going to term a crisis in funding Python.
+
+00:47:27 The needs of the community keep growing and growing, whether it's workshops, whether it's PyPI, whether it's conferences.
+ +00:47:32 And companies are getting squeezed. + +00:47:35 And the number of people, it always shocks me every time there are PSF elections, the incredibly small number of people who vote. + +00:47:42 Which means that, let's assume half the people who are members, third of the people. + +00:47:46 Like for the millions and millions of people who program Python out there, an infinitesimally + +00:47:50 small proportion of them actually join and help to fund it. + +00:47:53 So I'm not quite sure what to do with this other than express concern. + +00:47:57 But I feel like we've got to figure out ways to fund Python and the PSF in new ways that + +00:48:01 will allow it to grow and scale as needed. + +00:48:04 I couldn't agree more. + +00:48:06 Obviously, the PSF is close to my heart because I was on the board for, I think, a total of + +00:48:11 six or nine years or something over, you know, the last 25. + +00:48:15 I was also for six months, I was the interim general manager because Eva left and we hadn't + +00:48:21 hired Deb yet while I was on the board. + +00:48:23 I remember signing the sponsor contracts for the companies that came in wanting to sponsor + +00:48:29 Python. + +00:48:29 And it is like, it's ridiculous how, and I can say this working for a company that is + +00:48:35 one of the biggest sponsors of the PSF and has done so for years. + +00:48:38 It's ridiculous how small those sponsorships are and yet how grateful we were that they + +00:48:45 came in because every single one has such a big impact. + +00:48:48 You can do so much good with the money that comes in. + +00:48:52 We need more corporate sponsorships more than we need. + +00:48:55 Like, I mean, obviously a million people giving us a couple of bucks, giving the PSF, let's + +00:49:00 be clear. + +00:49:00 I'm not on the board anymore. + +00:49:01 Giving the PSF a couple of bucks would be fantastic. 
+ +00:49:05 But I think the big players in the big corporate players where all the AI money is, for instance, + +00:49:12 having done basically no sponsorship of the PSF is mind-boggling. It is a textbook + +00:49:18 tragedy of the commons right there, right? They rely entirely on PyPI and PyPI is run entirely + +00:49:25 with community resources, mostly because of very generous and consistent sponsorship, + +00:49:31 basically by Fastly, but also the other sponsors of the PSF. And yet very large players use those + +00:49:38 resources more than anyone else and don't actually contribute. Georgie Kerr, she wrote this fantastic + +00:49:45 blog post saying pretty much this straight after Europython. So Europython this year was really big + +00:49:52 actually. And she was wandering around looking at the sponsor booths and the usual players were there, + +00:49:57 but none of these AI companies were there. And the relationship actually between AI, if you want to + +00:50:03 call it that. Let's call it ML and neural networks. And like some of the really big companies and + +00:50:09 Python actually is really complex. Obviously, a lot of these companies and some of us are here, + +00:50:15 employ people to work on Python. Companies like Meta and Google have contributed massively to + +00:50:21 frameworks like PyTorch, TensorFlow, Keras. So it's not as simple a picture as saying cough up money + +00:50:27 all the time. Like there's a more complex picture here, but definitely there are some notable + +00:50:32 absences. And we talked about the volume of money going through. I totally agree with the sentiment. + +00:50:39 When the shortfall came and the grants program had to shut down, we were brainstorming at JetBrains, + +00:50:45 like maybe we can do some sort of, I don't know, donate some more money and call other companies + +00:50:51 to do it. Or we can call on people in the community. 
And I was like, I don't want to call
+
+00:50:55 on people in the community to do it because they're probably the same people who are also
+
+00:50:59 donating their time for Python. Like it's just squeezing people who give so much of themselves
+
+00:51:05 to this community even more. And it's not sustainable. Like Reuven said, if we keep doing
+
+00:51:10 this, the whole community is going to collapse. Like I'm sure we've all had our own forms of
+
+00:51:16 burnout from giving too much. I'm going to pat ourselves on the back here. Everyone on this
+
+00:51:19 call who works at a company, they're all sponsors of the PSF. Thank goodness. But there's obviously a
+
+00:51:25 lot of people not on this call who are not sponsors. And I know personally, I wish every
+
+00:51:29 company that generated revenue from Python usage donated to the PSF. And I
+
+00:51:35 think part of the problem is some people think it has to be like a hundred thousand dollars. It does
+
+00:51:38 not have to be a hundred thousand dollars. Now, if you can afford that amount, please do, or more. There
+
+00:51:43 are many ways to donate more than the maximum amount for getting on the website. But it's one
+
+00:51:48 of these funny things where a lot of people are just like, oh, it's not me, right? Like even startups don't.
+
+00:51:51 Some do, to give those ones credit, but others don't, because, like, oh, we're burning through
+
+00:51:56 capital. I was like, yeah, but we're asking for less than you'd pay a dev, by
+
+00:52:01 a lot, per year, right? Like the amount we actually ask for to get to the highest tier is still less
+
+00:52:07 than a common developer in Silicon Valley, if we're going to price point to a geographical
+
+00:52:13 location we can all kind of comprehend. I'm going to steal a Ned Batchelder observation here, and yeah, what
+
+00:52:18 the PSF would be happy with is less than a medium-sized company spends on the tips of expensed meals every
+00:52:25 year. Yeah. Yeah. And it's a long running problem, right? Like, I mean, I've been on the PSF for a
+
+00:52:31 long time, too. I've not served as many years as Thomas on the board, but I was like executive
+
+00:52:36 vice president because we had to have someone with that title at some point. It's always been a
+
+00:52:40 struggle, right? Like, and I also want to be clear, I'm totally appreciative of where we have
+
+00:52:45 gotten to, right? Because for the longest time, I was just dying for paid staff on the core team.
+
+00:52:50 And now we have three developers in residence. Thank goodness. Still not enough, to be clear.
+
+00:52:55 I want five.
+
+00:52:56 And I've always said that, but I'll happily take three.
+
+00:52:58 But it's one of these things where it's a constant struggle.
+
+00:53:00 And it got a little bit better before the pandemic
+
+00:53:03 just because everyone was spending on conferences
+
+00:53:05 and PyCon US is a big driver for the Python Software Foundation.
+
+00:53:08 And I know EuroPython's a driver for the EuroPython Society.
+
+00:53:12 But then COVID hit and conferences haven't picked back up.
+
+00:53:15 And then there's a whole new cohort of companies that have come in post-pandemic
+
+00:53:19 that have never had that experience of going to PyCon and sponsoring PyCon.
+
+00:53:22 And so they don't think about, I think, sponsoring PyCon
+
+00:53:25 and the PSF, because that's also a big kind of in your face, you should help sponsor this.
+
+00:53:29 And I think it's led to this kind of lull where overall spending has gone down, new entrants
+
+00:53:33 into the community have not had that experience and thought about it. And it's led to this kind
+
+00:53:37 of dearth where, yeah, the PSF had to stop giving out grant money. And it sucks. And I would love
+
+00:53:43 to see it not be that problem. I want to add one interesting data point that I discovered in
+
+00:53:47 short. Keep it short. Yes.
NumFOCUS has about twice the budget of the PSF. I was shocked to
+
+00:53:53 discover this. So basically it is possible to get money from companies to sponsor development of
+
+00:54:00 Python related projects. And I don't know what they're doing that we aren't. And I think it's
+
+00:54:05 worth talking and figuring it out. We need a fundraiser and marketer in residence, maybe. Who
+
+00:54:10 knows? Lauren does a great job, to be clear. The PSF has Lauren and Lauren is that. But it's still
+
+00:54:18 hard. We have someone doing it full time at the PSF and it's just hard to get companies to
+
+00:54:22 cough up cash.
+
+00:54:23 Yeah, and what do we get in return?
+
+00:54:25 Well, we already get that.
+
+00:54:26 So, yeah, I know.
+
+00:54:27 All right, Barry.
+
+00:54:28 To just, you know, shift gears into a different area,
+
+00:54:32 something that I've been thinking about a lot over this past year on the steering council.
+
+00:54:36 Thomas, I'm sure, is going to be, you know, very well aware,
+
+00:54:39 having been instrumental in the lazy imports PEP 810.
+
+00:54:45 We have to sort of rethink how we evolve Python
+
+00:54:50 and how we propose changes to Python and how we discuss those changes in the community.
+
+00:54:56 Because I think one of the things that I have heard over and over and over again is that authoring PEPs
+
+00:55:04 is incredibly difficult and emotionally draining and it's a time sink.
+
+00:55:11 And leading those discussions on discuss.python.org, which we typically call DPO,
+
+00:55:18 can be toxic at times and very difficult.
+
+00:55:21 So one of the things that I realized as I was thinking about this
+
+00:55:25 is that PEPs are 25 years old now, right?
+
+00:55:29 So we've had this, and not only just PEPs are old,
+
+00:55:33 but like we've gone through at least two, if not more, sort of complete revolutions
+
+00:55:38 in the way we discuss things.
+
+00:55:40 You know, the community has grown incredibly.
+
+00:55:43 The developer community is somewhat larger, but just the number of people
+
+00:55:47 who are using Python and who have an interest in it has grown exponentially. So it has become
+
+00:55:54 really difficult to evolve the language and the standard library and the interpreter. And we need
+
+00:56:01 to sort of think about how we can make this easier for people and not lose the voice of the user.
+
+00:56:09 And the number of people who actually engage in topics on DPO is the tip of the iceberg. You know,
+
+00:56:14 we've got millions and millions of users out there in the world who, for example, lazy imports will affect, free threading will affect, and don't even know that they have a voice.
+
+00:56:24 And maybe we have to basically represent that, but we have to do it in a much more collaborative and positive way.
+
+00:56:31 That's something that I've been thinking about a lot.
+
+00:56:33 And whether or not I'm on the steering council next year, I think this is something that I'm going to spend some time on trying to think about, you know, talk to people about ways we can make this easier for everyone.
+
+00:56:43 The diversity of use cases for Python in the last couple of years.
+
+00:56:47 So complex.
+
+00:56:48 Yes, exactly.
+
+00:56:49 It should also be prefaced that Barry created the PEP process. He should have started that one.
+
+00:56:55 It is that old.
+
+00:56:57 Yeah.
+
+00:56:58 By the way, just so everyone knows, these are not age jokes to be mean to Barry.
+
+00:57:02 We've all known Barry long enough that we know Barry's okay with us making these jokes.
+
+00:57:07 To be very, very clear.
+
+00:57:07 Also, I am almost as old as Barry, although I don't look as old as Barry.
+
+00:57:12 Yeah, we're all roughly the same age anyways.
+
+00:57:15 Yeah, Barry and I have known each other for 25 years,
+
+00:57:18 and I've always made these jokes with him.
+
+00:57:21 So it is different when you know each other in person.
+
+00:57:26 Let's put it that way.
+
+00:57:28 For the PEP process, I think for a lot of people, it's not obvious how difficult the process is.
+
+00:57:35 I mean, it wasn't even obvious to me.
+
+00:57:37 I saw people avoiding writing PEPs multiple times, and I was upset, like on the steering council, right?
+
+00:57:43 I saw people making changes where I thought, this is definitely something
+
+00:57:47 that should have been discussed in a PEP and the discussion should be recorded in a PEP and all that.
+
+00:57:51 And I didn't understand why they didn't until, basically until PEP 810.
+
+00:57:56 So I did PEP 779, which was giving free threading supported status
+
+00:58:02 at the start of the year.
+
+00:58:03 And the discussion there was, you know, sort of as expected, and it
+
+00:58:08 was already an accepted PEP.
+
+00:58:10 It was just the question of how does it become supported?
+
+00:58:12 That one wasn't too exhausting.
+
+00:58:14 And then we got to Lazy Imports, which was Pablo, who is another steering council member,
+
+00:58:20 as well as a bunch of other contributors, including me and two of my co-workers and
+
+00:58:24 one of my former co-workers, who had all had a lot of experience with Lazy Imports, but
+
+00:58:29 not necessarily as much experience with the PEP process.
+
+00:58:32 And Pablo took the front seat because he knew the PEP process and he's done like five PEPs
+
+00:58:37 in the last year or something, some ridiculous number.
+
+00:58:40 And he shared with us the vitriol he got, like, offline, for just the audacity of proposing
+
+00:58:49 something that people disagreed with or something. And that was like, this is a technical suggestion.
+
+00:58:54 This is not a code of conduct issue, where I have received my fair share of vitriol around.
+
+00:59:00 This is a technical discussion. And yet he gets these ridiculous accusations in his mailbox.
+
+00:59:06 And for some reason, only the primary author gets it as well, which is just weird to me.
+
+00:59:12 But people are lazy.
+
+00:59:13 Thomas, is what I think you just said.
+
+00:59:15 Remember, the steering council exists because Guido got the brunt of this for PEP 572, which was the walrus operator.
+
+00:59:24 Right. Which is just like this minor little syntactic thing that is kind of cool when you need it.
+
+00:59:31 But like just the amount of anger and negative energy and vitriol that he got over that was enough for him to just say, I'm out, you know, and you guys figure it out.
+
+00:59:42 And that cannot be an acceptable way to discuss the evolution of the language.
+
+00:59:48 Especially since apparently now every single PEP author of any contentious or semi-contentious PEP.
+
+00:59:55 Although I have to say, PEP 810 had such broad support.
+
+00:59:59 It was hard to call it contentious.
+
+01:00:01 It's just there's a couple of very loud opinions, I guess.
+
+01:00:04 And I'm not saying we shouldn't listen to people.
+
+01:00:06 We should definitely listen to especially contrary opinions.
+
+01:00:11 But there has to be a limit.
+
+01:00:12 There has to be an acceptable way of bringing things up.
+
+01:00:15 There has to be an acceptable way of saying, hey, you didn't actually read the PEP.
+
+01:00:21 Please go back and reconsider everything you said after you fully digested the things,
+
+01:00:27 because everything's already been addressed in the PEP.
+
+01:00:29 It's just really hard to do this in a way that doesn't destroy the relationship with the person you're telling this, right?
+
+01:00:37 It's hard to tell people, hey, I'm not going to listen to you because you haven't, you know, you've done a bad job.
+
+01:00:44 You've chosen not to inform yourself.
+
+01:00:46 I think you make another really strong point, Thomas, which is that there have been changes that have been made to Python that really should have been a PEP.
+
+01:00:55 And they aren't because people don't want to go through core developers, don't want to go through this gauntlet.
+
+01:01:01 And so they'll create a PR and just land that.
+
+01:01:03 But that's also not good because then, you know, we don't have that.
+
+01:01:06 We don't have the right level of consideration.
+
+01:01:11 And you think about the way that, you know, if you're in your job and you're making a change to something in your job, you have a very close relationship to your teammates.
+
+01:01:20 And so you have that kind of respect and hopefully, right, like compassion and consideration.
+
+01:01:26 And you can have a very productive discussion about a thing and you may win some arguments and you may lose some arguments, but the team moves forward as one.
+
+01:01:35 And I think we've lost a bit of that in Python.
+
+01:01:38 So that's not great.
+
+01:01:40 I think society in general could use a little more civility and kindness, especially to strangers that they haven't met in forums, social media, driving, you name it.
+
+01:01:50 Okay, but we're not going to solve that here, I'm sure.
+
+01:01:54 So instead, let's do Gregory's topic.
+
+01:01:57 Hey, I'm going to change topics quite a bit, but I wanted to call 2025 the year of type checking and language server protocols.
+
+01:02:04 So many of us probably have used tools like mypy to check to see if the types line up in our code or whether or not we happen to be overriding functions correctly.
+
+01:02:14 And so I've used mypy for many years and loved the tool and had a great opportunity to chat with the creator of it.
+
+01:02:20 And I integrate that into my CI and it's really been wonderful.
+
+01:02:24 And I've also been using a lot of LSPs, like, for example, Pyright or Pylance.
+
+01:02:28 But in this year, one of the things that we've seen is, number one, Pyrefly from the team at Meta.
+
+01:02:34 We've also seen ty from the team at Astral.
+
+01:02:36 And there's another one called Zuban.
+
+01:02:38 And Zuban is from David Halter.
+
+01:02:41 David was also the person who created Jedi, which is another system in Python that helped with a lot of LSP tasks.
+
+01:02:48 What's interesting about all three of the tools that I just mentioned
+
+01:02:51 is that they're implemented in Rust, and they have taken a lot of the opportunity to make the type checker
+
+01:02:58 and/or the LSP significantly faster.
+
+01:03:01 So for me, this has changed how I use the LSP or the type checker and how frequently I use it.
+
+01:03:07 And in my experience, it has helped me to take things that might take tens of seconds or hundreds of seconds and cut them down often to less than a second.
+
+01:03:17 And it's really changed the way in which I'm using a lot of the tools like ty or Pyrefly or Zuban.
+
+01:03:24 So I can have some more details if I'm allowed to share, Michael, but I would say 2025 is the year of type checkers and LSPs.
+
+01:03:31 I think given the timing, let's have people give some feedback.
+
+01:03:34 I personally have been using Pyrefly a ton and am a big fan of it.
+
+01:03:38 I don't know if I'm allowed to have an opinion that isn't Pyrefly is awesome.
+
+01:03:43 I mean, I'm not on the Pyrefly team, but I do regularly chat with people from the Pyrefly team.
+
+01:03:49 Tell people real quick what it is, Thomas.
+
+01:03:51 So Pyrefly is Meta's attempt at a Rust-based type checker.
+
+01:03:56 And so it's very similar to ty.
+
+01:03:58 Started basically at the same time, a little later.
+
+01:04:01 Meta originally had a type checker called Pyre, which was written in OCaml.
+
+01:04:06 They basically decided to start a rewrite in Rust.
+
+01:04:09 And then that really took off.
+
+01:04:11 And that's where we're going now.
+
+01:04:13 Yeah.
+
+01:04:14 Yeah.
+
+01:04:14 I don't know what I can say because I'm actually on the same team as the Pylance team.
+
+01:04:18 So, but no, I mean, I think it's good.
+
+01:04:21 I think this is one of those interesting scenarios where some people realize like,
+
+01:04:25 you know what, we're going to pay the penalty of writing a tool in a way that's faster,
+
+01:04:30 but makes us go slower because the overall win for the community
+
+01:04:33 is going to be a good win.
+
+01:04:34 So it's worth that headache, right?
+
+01:04:36 Not that I want to scare people off from writing Rust, but let's be honest,
+
+01:04:39 it takes more work to write Rust code than it does to write Python code.
+
+01:04:42 But some people chose to make that trade-off and we're all benefiting from it.
+
+01:04:46 The one thing I will say that's kind of interesting from this
+
+01:04:48 that hasn't gotten a lot of play yet because it's still being developed,
+
+01:04:51 but Pylance is actually working with the Pyrefly team to define a type server protocol, TSP, so that a lot of these type servers can just kind of feed the type information to a higher level LSP and let that LSP handle the stuff like symbol renaming and all that stuff.
+
+01:05:05 Right. Because the key thing here and the reason there's so many different type checkers is there is a spec.
+
+01:05:12 Right. And everyone's trying to implement it. But there's differences in terms of type inferencing.
+
+01:05:16 And if you actually go listen to Michael's interview on Talk Python To Me with the Pyrefly team,
+
+01:05:21 they actually did a nice little explanation of the difference between Pyright's approach
+
+01:05:25 and Pyrefly's approach.
+
+01:05:27 And so there's a bit of variance.
+
+01:05:28 But for instance, I think there's some talk now of trying to figure out, how do we make it
+
+01:05:32 so everyone doesn't have to reimplement how to rename a symbol, right?
+
+01:05:35 That's kind of boring.
+
+01:05:35 That's not where the interesting work is.
+
+01:05:37 And that's not performance-critical, in the sense that what you want is to instantaneously get that squiggly red line
+
+01:05:43 in whether it's VS Code or it's in PyCharm or whatever your editor is, right?
+
+01:05:48 You want to get it as fast as possible, but the rename-
+
+01:05:51 Jupyter.
+
+01:05:52 Jupyter.
+
+01:05:52 No, not Emacs.
+
+01:05:53 Everything but Emacs.
+
+01:05:53 No, not Emacs.
+
+01:05:56 Just to bring things full circle, it's that focus on user experience, right?
+
+01:06:00 Which is, yes, you want that squiggly line, but when things go wrong,
+
+01:06:04 when your type checker says, oh, you've got a problem,
+
+01:06:07 you know, like I think about as an analogy, how Pablo has done an amazing amount of work
+
+01:06:14 on the error reporting, right?
+
+01:06:15 When you get an exception and, you know, now you have a lot more clues about what is it that I actually have to change to fix the problem, right?
+
+01:06:26 Like so many times years ago, you know, when people were using mypy, for example, and they'd have some complex failure of their type annotations and have absolutely no idea what to do about it.
+
+01:06:40 And so getting to a place where now we're not just telling people you've done it wrong,
+
+01:06:44 but also here's some ideas about how to fix it.
+
+01:06:49 I think this is a full circle here because honestly, using typing in your Python code
+
+01:06:54 gives a lot of context to the AI when you ask for help.
+
+01:06:57 If you just give it a fragment, it can't work with it.
+
+01:06:59 That's true.
+
+01:07:00 And also, if you can teach your AI agent to use the type checkers and use the LSPs,
+
+01:07:06 it will also generate better code for you.
+
+01:07:09 I think the one challenge I would add to what Barry said a moment ago is that if you're a developer and you're using, say, three or four type checkers at the same time, you also have to be careful about the fact that some of them won't flag an error that the other one will flag.
+
+01:07:24 So I've recently written Python programs and even built a tool with one of my students named Benedek that will automatically generate Python programs that will cause type checkers to disagree with each other.
+
+01:07:39 One tool will flag it as an error, but none of the other tools will flag it as an error.
+
+01:07:45 And there are also cases where the new tools will all agree with each other, but disagree with mypy.
+
+01:07:50 So there is a type checker conformance test suite.
+
+01:07:53 But I think as developers, even though it might be the year of LSP and type checker,
+
+01:07:57 we also have to be aware of the fact that these tools are maturing and there's still disagreement among them.
+
+01:08:03 And also just different philosophies when it comes to how to type check and how to infer.
+
+01:08:08 And so we have to think about all of those things as these tools mature and become part of our ecosystem.
+
+01:08:12 Yeah, Greg, that last point is important.
+
+01:08:14 Out of curiosity, how did the things where the type checkers disagree
+
+01:08:18 match up with the actual runtime behavior of Python?
+
+01:08:21 Was it like false positives or false negatives?
+
+01:08:24 That's a really good question.
+
+01:08:26 I'll give you more details in the show notes because we actually have it in a GitHub repository
+
+01:08:30 and I can share it with people.
+
+01:08:31 But I think some of it might simply be related to cases where mypy is more conformant to the spec, but the other new tools are not as conformant.
+
+01:08:43 So you can import overload from typing and then have a very overloaded function.
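To make that kind of disagreement concrete, here is a hypothetical sketch of two areas that come up in this discussion, @overload definitions and optional-versus-required arguments. The function names are invented for illustration, and exactly which checkers flag which line depends on the tool and its strictness settings:

```python
from typing import Optional, overload

# An overloaded function: checkers differ in how strictly they verify
# that the implementation is compatible with the declared overloads.
@overload
def describe(value: int) -> str: ...
@overload
def describe(value: str) -> str: ...
def describe(value):
    return f"got {value!r}"

def shout(message: str) -> str:
    return message.upper() + "!"

def greet(name: Optional[str]) -> str:
    # A strict checker rejects shout(name) without a None check because
    # name may be None; explicit narrowing satisfies lenient and strict
    # tools alike.
    if name is None:
        return shout("hello")
    return shout(name)

print(describe(3))     # got 3
print(greet("world"))  # WORLD!
print(greet(None))     # HELLO!
```

At runtime all three calls succeed either way; the narrowing check exists purely for the type checkers, which is why tools can reasonably disagree about whether the unnarrowed version deserves an error.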
+
+01:08:48 And mypy will actually flag the fact that it's an overloaded function with multiple signatures, whereas Pyright and Pyrefly and zuban will not actually flag that, even though they should.
+
+01:09:00 Another big area is optional versus not optional.
+
+01:09:03 Yes.
+
+01:09:03 Like, are you allowed to pass a thing that is an optional string when the thing accepts a string?
+
+01:09:09 Some stuff's like, yeah, it's probably fine.
+
+01:09:10 Others are like, no, no, no.
+
+01:09:11 This is an error; you have to do a check.
+
+01:09:13 And if you want to switch type checkers, you might end up with a thousand warnings that you didn't previously have because of an intentional difference of opinion on how strict to be, I think.
+
+01:09:23 Yeah.
+
+01:09:23 So you have to think about false positives and false negatives when you're willing to break the build because of a type error.
+
+01:09:29 All of those things are things you have to factor in.
+
+01:09:31 But to go quickly to this connection to AI, I know it's only recently, but the Pyrefly
+
+01:09:37 team actually announced that they're making Pyrefly work directly with Pydantic AI.
+
+01:09:42 So there's going to be an interoperability between those tools so that when you're building
+
+01:09:46 an AI agent using Pydantic AI, you can also then have better guarantees when you're using
+
+01:09:52 Pyrefly as your type checker.
+
+01:09:53 It makes total sense, though, because then the reasoning LLM that's at the core of the
+
+01:09:57 agent can actually have that information before it tries to execute the code and you don't get in that
+
+01:10:04 loop that they often get in. You can correct it before it runs. Yeah, really good point. I want to
+
+01:10:09 just sort of express my appreciation to all the people working on this typing stuff. As someone
+
+01:10:14 who's come from many, many years in dynamic languages, I was always like, oh, typing.
Those
+
+01:10:27 B, I love seeing how easy it is for people to ease into it when they're in Python.
+
+01:10:32 It's not all or nothing.
+
+01:10:34 C, I love the huge number of tools.
+
+01:10:36 The competition in this space is really exciting.
+
+01:10:39 And D, guess what?
+
+01:10:40 It really, really does help.
+
+01:10:42 And I'll even add an E, which is my students who come from Java, C++, C#, and so forth
+
+01:10:47 feel relief.
+
+01:10:48 They find that without type checking, it's like doing a trapeze act without a safety
+
+01:10:53 net.
+
+01:10:53 And so they're very happy to have that typing
+
+01:10:57 in there.
+
+01:10:58 So kudos to everyone.
+
+01:10:59 All right, folks, we are out of time.
+
+01:11:00 This could literally go for hours longer.
+
+01:11:04 It was a big year.
+
+01:11:05 It was a big year, but I think we need to just have a final word.
+
+01:11:10 I'll start and we'll just go around.
+
+01:11:12 So my final thought here is, we've talked about some things that are negatives
+
+01:11:17 or sort of downers or whatever here and there, but I still think it's an incredibly exciting time
+
+01:11:22 to be a developer, data scientist, there's so much opportunity out there.
+
+01:11:26 There's so many things to learn and take advantage of and stay on top of.
+
+01:11:29 And amazing.
+
+01:11:30 Every day is slightly more amazing than the previous day.
+
+01:11:33 So I love it.
+
+01:11:34 Gregory, let's go to you next.
+
+01:11:35 Let's go around the circle.
+
+01:11:35 Yeah, I wanted to give a shout out to all of the local Python conferences.
+
+01:11:40 I actually, on a regular basis, have attended the PyOhio conference.
+
+01:11:44 And it is incredible.
+
+01:11:45 The organizers do an absolutely amazing job.
+
+01:11:49 And they have it hosted on a campus, oftentimes at Ohio State or Cleveland State University.
+
+01:11:54 And incredibly, PyOhio is a free conference that anyone can attend with no registration fee.
+
+01:12:01 So Michael, on a comment that I think is really positive, wow, I'm so excited about the regional
+
+01:12:06 Python conferences that I've been able to attend.
+
+01:12:08 Thomas.
+
+01:12:09 Wow, I didn't expect this.
+
+01:12:10 So I think I want to give a shout out to new people joining the community and also joining
+
+01:12:16 the core developer team as triagers, or just drive-by commenters. I know we harped a little bit
+
+01:12:22 about people, you know, giving strong opinions and discussions, but I always look to the far future
+
+01:12:27 as well as the near future. And we always need new people. We need new ideas. We need new opinions. So
+
+01:12:33 yeah, I'm, I'm excited that there's still people joining and signing up, even when it's
+
+01:12:39 thankless work. So I guess I want to say thank you to people doing all the thankless work. Jody.
+
+01:12:44 Yeah, I want to say this is actually really only my third year or so really in the Python community.
+
+01:12:52 So before that, I was just sort of on the fringes, right?
+
+01:12:54 And after I started advocacy, I started going to the conferences and meeting people.
+
+01:12:58 And I think I didn't kind of get how special the community was until I watched the Python documentary this year.
+
+01:13:04 And I talked to Paul about this, Paul Everitt, afterwards, also made fun of him for his like early 2000s fashion.
+
+01:13:11 But I think, yeah, like I'm a relative newcomer to this community and you've all made me feel so
+
+01:13:18 welcome. And I guess I want to thank all the incumbents for everything you've done to make
+
+01:13:24 this such a special tech community for minorities and everyone, newbies, you know, Python,
+
+01:13:30 Python is love. Oh, geez. How am I supposed to follow that?
+
+01:13:37 I think one of the interesting things that we're kind of looping on here is
+
+01:13:41 I think the language evolution has slowed down, but it's obviously not stopped, right?
+
+01:13:45 Like as Thomas pointed out, there's a lot more stuff happening behind the scenes.
+
+01:13:49 Lazy imports are coming, and that was a syntactic change, which apparently brings out the mean side of some people.
+
+01:13:55 And we've obviously got our challenges and stuff, but things are still going.
+
+01:13:58 We're still moving along.
+
+01:13:59 We're still trying to be an open, welcoming place for people like Jody and everyone else who's new coming on over
+
+01:14:05 and to continue to be a fun place for all of us slightly gray-beard people who have been here for a long time
+
+01:14:11 to make us want to stick around.
+
+01:14:12 I think it's just more of the same, honestly.
+
+01:14:15 It's all of us just continuing to do what we can to help out to keep this community being a great place.
+
+01:14:20 And it all just keeps going forward.
+
+01:14:22 And I'll just end with, if you work for a company that hasn't sponsored the PSF,
+
+01:14:26 please do so.
+
+01:14:26 It's rare to have, I mean, a programming language or any sort of tool
+
+01:14:32 where it is both really, really beneficial to your career
+
+01:14:36 and you get to hang out with really special, nice, interesting people.
+
+01:14:40 And it's easy to take all that for granted if you've been steeped in the community.
+
+01:14:45 I went to a conference about six months ago, a non-Python conference.
+
+01:14:49 And that was shocking to me to discover that all the speakers were from advertisers and sponsors.
+
+01:14:55 Everything was super commercialized.
+
+01:14:57 People were not interested in just like hanging out and sharing with each other.
+
+01:15:00 And it was a shock to me because I've been to basically only Python conferences for so many years.
+
+01:15:05 I was like, oh, that's not the norm in the industry.
+
+01:15:08 So we've got something really special going that not only is good for the people, but good for everyone's careers and mutually reinforcing and helping each other.
+
+01:15:16 And that's really fantastic.
+
+01:15:17 And we should appreciate that.
+
+01:15:19 Barry, final word.
+
+01:15:20 Thomas stole my thunder just a little bit, but just to tie a couple of these ideas together.
+
+01:15:25 Python, and you know, Brett said this, right?
+
+01:15:29 Python is the community, or the community is Python.
+
+01:15:33 There's no company that is telling anybody what Python should be.
+
+01:15:38 Python is what we make it.
+
+01:15:40 And, you know, as folks like myself get a little older and, you know, and we have younger people
+
+01:15:47 coming into the community, both developers and everything else
+
+01:15:51 who are shaping Python into their vision.
+
+01:15:54 I encourage you, if you've thought about becoming a core dev,
+
+01:15:58 find a mentor.
+
+01:15:59 There are people out there that will help you.
+
+01:16:01 If you want to be involved in the community, the PSF, you know, reach out.
+
+01:16:06 There are people who will help guide you into this community. You can be involved. Do not let any self-imposed limitations stop you from
+
+01:16:16 becoming part of the Python community in the way that you want to. And eventually run for the
+
+01:16:22 steering council because we need many, many, many more candidates next year. And you don't need any
+
+01:16:29 qualifications either because I'm a high school dropout and I never went to college or anything.
+
+01:16:34 And look at me.
+
+01:16:35 And I have a PhD and I will tell you, I did not need all that to become a Python developer
+
+01:16:39 because I was the Python developer before I got the PhD.
+
+01:16:42 I'm a bass player.
+
+01:16:43 So if I can do it, anybody can do it.
+
+01:16:47 Thank you everyone for being here.
+
+01:16:49 This awesome look back on the year, and I really appreciate you all taking the time.
+
+01:16:52 Thank you, Michael.
+
+01:16:53 Thanks everybody.
+
+01:16:54 Bye everybody.
+
+01:16:57 This has been another episode of Talk Python To Me.
+
+01:17:00 Thank you to our sponsors.
+
+01:17:01 Be sure to check out what they're offering.
+
+01:17:02 It really helps support the show.
+
+01:17:04 Look into the future and see bugs before they make it to production.
+
+01:17:08 Sentry's Seer AI Code Review uses historical error and performance information at Sentry
+
+01:17:13 to find and flag bugs in your PRs before you even start to review them.
+
+01:17:18 Stop bugs before they enter your code base.
+
+01:17:20 Get started at talkpython.fm/seer-code-review.
+
+01:17:25 If you or your team needs to learn Python, we have over 270 hours of beginner and advanced courses
+
+01:17:30 on topics ranging from complete beginners to async code, Flask, Django, HTMX, and even LLMs.
+
+01:17:37 Best of all, there's no subscription in sight.
+
+01:17:40 Browse the catalog at talkpython.fm.
+
+01:17:42 And if you're not already subscribed to the show on your favorite podcast player,
+
+01:17:46 what are you waiting for?
+
+01:17:47 Just search for Python in your podcast player.
+
+01:17:49 We should be right at the top.
+
+01:17:51 If you enjoy that geeky rap song, you can download the full track.
+
+01:17:54 The link is actually in your podcast player's show notes.
+
+01:17:56 This is your host, Michael Kennedy.
+
+01:17:58 Thank you so much for listening.
+
+01:17:59 I really appreciate it.
+
+01:18:01 I'll see you next time.
+
diff --git a/transcripts/532-python-2025-year-in-review.vtt b/transcripts/532-python-2025-year-in-review.vtt
new file mode 100644
index 0000000..f792004
--- /dev/null
+++ b/transcripts/532-python-2025-year-in-review.vtt
@@ -0,0 +1,3710 @@
+WEBVTT
+
+00:00:00.020 --> 00:00:03.540
+Python in 2025 is a delightfully refreshing place.
+
+00:00:04.080 --> 00:00:05.100
+The GIL's days are numbered,
+
+00:00:05.560 --> 00:00:07.220
+packaging is getting sharper tools,
+
+00:00:07.620 --> 00:00:09.200
+and the type checkers are multiplying
+
+00:00:09.400 --> 00:00:11.040
+like gremlins snacking after midnight.
+
+00:00:11.500 --> 00:00:13.720
+On this episode, we have an amazing panel
+
+00:00:13.920 --> 00:00:15.500
+to give us a range of perspectives
+
+00:00:15.940 --> 00:00:18.480
+on what mattered in 2025 in Python.
+
+00:00:19.040 --> 00:00:20.940
+We have Barry Warsaw, Brett Cannon,
+
+00:00:21.380 --> 00:00:24.160
+Gregory Kapfhammer, Jody Burchell, Reuven Lerner,
+
+00:00:24.580 --> 00:00:27.480
+and Thomas Wouters on the show to give us their thoughts.
+
+00:00:28.040 --> 00:00:33.260
+This is Talk Python To Me, episode 532, recorded December 9th, 2025.
+
+00:00:50.460 --> 00:00:55.180
+Welcome to Talk Python To Me, the number one Python podcast for developers and data scientists.
+
+00:00:55.700 --> 00:00:57.100
+This is your host, Michael Kennedy.
+
+00:00:57.480 --> 00:01:01.020
+I'm a PSF fellow who's been coding for over 25 years.
+
+00:01:01.660 --> 00:01:02.780
+Let's connect on social media.
+
+00:01:03.100 --> 00:01:06.180
+You'll find me and Talk Python on Mastodon, BlueSky, and X.
+
+00:01:06.540 --> 00:01:08.360
+The social links are all in your show notes.
+
+00:01:09.140 --> 00:01:12.640
+You can find over 10 years of past episodes at talkpython.fm.
+
+00:01:12.820 --> 00:01:16.020
+And if you want to be part of the show, you can join our recording live streams.
+
+00:01:16.380 --> 00:01:16.860
+That's right.
+ +00:01:17.160 --> 00:01:20.300 +We live stream the raw uncut version of each episode on YouTube. + +00:01:20.660 --> 00:01:25.320 +Just visit talkpython.fm/youtube to see the schedule of upcoming events. + +00:01:25.530 --> 00:01:29.200 +Be sure to subscribe there and press the bell so you'll get notified anytime we're recording. + +00:01:29.940 --> 00:01:32.740 +Look into the future and see bugs before they make it to production. + +00:01:33.480 --> 00:01:38.840 +Sentry's SEER AI code review uses historical error and performance information at Sentry + +00:01:39.150 --> 00:01:43.160 +to find and flag bugs in your PRs before you even start to review them. + +00:01:43.840 --> 00:01:45.660 +Stop bugs before they enter your code base. + +00:01:46.160 --> 00:01:50.040 +Get started at talkpython.fm/seer-code-review. + +00:01:50.880 --> 00:01:58.920 +Hey, before we jump into the interview, I just want to send a little message to all the companies out there with products and services trying to reach developers. + +00:01:59.620 --> 00:02:01.180 +That is the listeners of this show. + +00:02:01.680 --> 00:02:05.500 +As we're rolling into 2026, I have a bunch of spots open. + +00:02:05.600 --> 00:02:12.360 +So please reach out to me if you're looking to sponsor a podcast or just generally sponsor things in the community. + +00:02:12.960 --> 00:02:14.500 +And you haven't necessarily considered podcasts. + +00:02:14.700 --> 00:02:19.620 +You really should reach out to me and I'll help you connect with the Talk Python audience. 
+
+00:02:20.600 --> 00:02:27.280
+Thanks everyone for listening all of 2025, and here we go into 2026. Cheers. Hey everyone, it's so
+
+00:02:27.370 --> 00:02:31.560
+awesome to be here with you all. Thanks for taking the time out of your day to be part of Talk Python
+
+00:02:31.820 --> 00:02:37.740
+for this year in review, this Python year in review. So yeah, let's just jump right into it. Gregory,
+
+00:02:38.180 --> 00:02:42.500
+welcome, welcome to the show, welcome back to the show. How you doing? Hi, I'm an associate professor
+
+00:02:42.660 --> 00:02:46.559
+of computer and information science, and I do research in software engineering and software
+
+00:02:46.580 --> 00:02:51.720
+testing. I've built a bunch of Python tools, and one of the areas we're studying now is flaky test
+
+00:02:51.920 --> 00:02:57.720
+cases in Python projects. I'm also really excited about teaching in a wide variety of areas. In fact,
+
+00:02:57.720 --> 00:03:02.960
+I use Python for operating systems classes or theory of computation classes. And one of the
+
+00:03:02.960 --> 00:03:08.560
+things I'm excited about is being a podcast host. I'm also a host on the Software Engineering Radio
+
+00:03:08.820 --> 00:03:14.540
+podcast sponsored by the IEEE Computer Society, and I've had the cool opportunity to interview a
+
+00:03:14.540 --> 00:03:18.460
+whole bunch of people in the Python community. So Michael, thanks for welcoming me to the show.
+
+00:03:18.640 --> 00:03:22.640
+Yeah, it's awesome to have you back. And we talked about flaky tests last time. I do have to say
+
+00:03:23.100 --> 00:03:29.140
+your AV setup is quite good. I love the new mic and all that. Thomas, welcome. Awesome to have
+
+00:03:29.160 --> 00:03:34.100
+you here. Thanks for having me. I'm Thomas Wouters. I'm a longtime Python core developer,
+
+00:03:34.300 --> 00:03:40.320
+although not as long as one of the other guests on this podcast. I worked at Google for 17 years.
+
+00:03:40.460 --> 00:03:45.380
+For the last year or so I've worked at Meta. In both cases, I work on Python itself within the
+
+00:03:45.600 --> 00:03:50.920
+company and just deploying it internally. I've also been a board member of the PSF, although I'm not
+
+00:03:51.100 --> 00:03:58.020
+one right now. And I've been a steering council member for five years and currently not because
+
+00:03:58.280 --> 00:04:02.380
+the elections are going on and I don't know what the result is going to be. But I think there's like
+
+00:04:02.820 --> 00:04:09.099
+a five-in-six chance that I'll be on the steering council since we only have six candidates for
+
+00:04:09.120 --> 00:04:14.680
+five positions when this episode probably airs. I don't know. That's quite the contribution to the
+
+00:04:14.740 --> 00:04:19.299
+whole community. Thank you. I always forget this. I also got the, what is it, the Distinguished
+
+00:04:19.400 --> 00:04:23.780
+Service Award from the PSF this year. I should probably mention that. So yes, I have been
+
+00:04:24.040 --> 00:04:29.680
+recognized. No need to talk about it further. Wonderful. Wonderful. Jody, welcome back on the
+
+00:04:29.840 --> 00:04:34.039
+show. Awesome to catch up with you. Yeah, thanks for having me back. I am a data scientist and
+
+00:04:34.060 --> 00:04:39.560
+developer advocate at JetBrains working on PyCharm. And I've been a data scientist for around 10
+
+00:04:39.840 --> 00:04:45.220
+years. And prior to that, I was actually a clinical psychologist. So that was my training,
+
+00:04:45.610 --> 00:04:49.860
+my PhD, but I abandoned academia for greener pastures. Let's put it that way.
+
+00:04:50.190 --> 00:04:50.860
+Noah Franz-Gurig.
+
+00:04:53.180 --> 00:04:55.000
+Brett, hello. Good to see you.
+
+00:04:55.100 --> 00:05:00.200
+Hello. Yes. Yeah, let's see here. I've been at Microsoft for 10 years. I started
+
+00:05:00.300 --> 00:05:03.080
+working on AI R&D for Python developers.
+
+00:05:03.880 --> 00:05:06.040
+Also keep WASI running for Python here
+
+00:05:06.380 --> 00:05:09.140
+and do a lot of internal consulting for teams outside.
+
+00:05:09.860 --> 00:05:11.880
+I am actually the shortest running
+
+00:05:12.700 --> 00:05:14.060
+core developer on this call, amazingly,
+
+00:05:14.200 --> 00:05:15.660
+even though I've been doing it for 22 years.
+
+00:05:16.060 --> 00:05:17.840
+I've also only gotten the Frank Willison Award,
+
+00:05:18.080 --> 00:05:18.820
+not the DSA.
+
+00:05:19.300 --> 00:05:20.900
+So I feel very underaccomplished here
+
+00:05:20.900 --> 00:05:21.600
+as a core developer.
+
+00:05:22.040 --> 00:05:23.700
+Yeah, that's me in a nutshell.
+
+00:05:24.060 --> 00:05:25.900
+Otherwise, I'm still trying to catch that.
+
+00:05:26.940 --> 00:05:27.460
+Most quoted.
+
+00:05:27.900 --> 00:05:29.160
+Yeah, most quoted.
+
+00:05:29.220 --> 00:05:33.360
+I will say actually at work, it is in my email footer that I'm a famous Python quotationist.
+
+00:05:33.700 --> 00:05:35.660
+That was Anthony Shaw's suggestion, by the way.
+
+00:05:35.860 --> 00:05:39.600
+That was not mine, but it does link to the April Fool's joke from last year.
+
+00:05:39.850 --> 00:05:42.920
+And I am still trying to catch Anthony Shaw, I think, on appearances on this podcast.
+
+00:05:43.060 --> 00:05:44.120
+Well, plus one.
+
+00:05:44.850 --> 00:05:46.060
+Anthony Shaw should be here, honestly.
+
+00:05:46.440 --> 00:05:47.540
+I mean, I put it out into Discord.
+
+00:05:48.520 --> 00:05:50.740
+He could have been here, but probably at an odd time.
+
+00:05:51.000 --> 00:05:54.820
+You used to work on VS Code a bunch on the Python aspect of VS Code.
+
+00:05:54.870 --> 00:05:56.800
+You recently changed roles, right?
+
+00:05:57.000 --> 00:05:57.520
+Not recently.
+
+00:05:57.560 --> 00:05:59.500
+That was, I used to be the dev manager.
+
+00:05:59.500 --> 00:06:00.880
+Every seven years, years ago.
+
+00:06:01.100 --> 00:06:02.580
+Yeah, September of 2024.
+
+00:06:03.530 --> 00:06:04.200
+So it's been over a year.
+
+00:06:04.330 --> 00:06:04.940
+But yeah, I used to be the dev manager.
+
+00:06:04.940 --> 00:06:06.040
+That counts as recent for me.
+
+00:06:06.240 --> 00:06:09.400
+Yes, I used to be the dev manager for the Python experience in VS Code.
+
+00:06:09.520 --> 00:06:10.340
+Okay, very cool.
+
+00:06:10.540 --> 00:06:11.340
+That's quite a shift.
+
+00:06:11.480 --> 00:06:12.820
+Yeah, I went back to being an IC, basically.
+
+00:06:13.820 --> 00:06:16.220
+You're good at your TPS reports again now?
+
+00:06:17.500 --> 00:06:19.480
+Actually, I just did do my connect, so I kind of did.
+
+00:06:19.580 --> 00:06:19.900
+Awesome.
+
+00:06:20.800 --> 00:06:23.480
+Reuven, I bet you haven't filed a TPS report in at least a year.
+
+00:06:23.660 --> 00:06:24.420
+So yeah, I'm Reuven.
+
+00:06:24.600 --> 00:06:27.960
+I'm a freelance Python and Pandas trainer.
+
+00:06:28.460 --> 00:06:31.860
+I just celebrated this past week 30 years since going freelance.
+
+00:06:32.660 --> 00:06:34.340
+So I guess it's working out okay.
+
+00:06:35.700 --> 00:06:37.720
+We'll know at some point if I need to get a real job.
+
+00:06:38.020 --> 00:06:41.740
+I teach Python and Pandas both at companies and on my online platform.
+
+00:06:41.970 --> 00:06:42.660
+I have newsletters.
+
+00:06:42.850 --> 00:06:47.160
+I've written books, spoken at conferences, and generally try to help people improve their
+
+00:06:47.590 --> 00:06:51.660
+Python and Pandas fluency and confidence and have a lot of fun with this community as well
+
+00:06:51.660 --> 00:06:52.200
+as with the language.
+
+00:06:52.520 --> 00:06:53.260
+Oh, good to have you here.
+
+00:06:53.480 --> 00:06:55.840
+Barry, it's great to have a musician on the show.
+
+00:06:56.840 --> 00:06:57.200
+Thanks.
+
+00:06:58.340 --> 00:06:59.780
+Yeah, I've got my basses over here.
+
+00:07:00.060 --> 00:07:02.200
+So, you know, if you need to be serenaded.
+
+00:07:02.740 --> 00:07:05.280
+Yeah, like a Zen of Python may break out at any moment.
+
+00:07:05.380 --> 00:07:06.640
+You never know when it's going to happen.
+
+00:07:07.000 --> 00:07:07.880
+Thanks for having me here.
+
+00:07:08.160 --> 00:07:12.140
+Yeah, I've been a core developer for a long time, since 1994.
+
+00:07:13.840 --> 00:07:19.240
+And I've been, you know, in the early days, I did tons of stuff for Python.org.
+
+00:07:20.000 --> 00:07:27.060
+I worked with Guido at CNRI and we moved everything, the mailing lists, the postmaster stuff,
+
+00:07:27.580 --> 00:07:32.480
+and the version control systems back in the day, websites, all that kind of stuff.
+
+00:07:32.560 --> 00:07:34.720
+I try to not do any of those things anymore.
+
+00:07:35.500 --> 00:07:38.500
+There's way more competent people doing that stuff now.
+
+00:07:39.120 --> 00:07:41.120
+I have been a release manager.
+
+00:07:42.920 --> 00:07:46.860
+I'm currently back on the steering council and running again.
+
+00:07:47.940 --> 00:07:52.220
+Between Thomas and I, we'll see who makes it to six years, I guess.
+
+00:07:52.880 --> 00:07:55.460
+And I'm currently working for NVIDIA,
+
+00:07:56.050 --> 00:07:57.780
+and I do all Python stuff.
+
+00:07:58.680 --> 00:08:01.860
+It's half and half, roughly, of internal things
+
+00:08:02.120 --> 00:08:05.140
+and external open source community work,
+
+00:08:05.600 --> 00:08:08.160
+both in packaging and in core Python.
+
+00:08:08.700 --> 00:08:10.800
+That's, I guess, I think that's about it.
+
+00:08:10.920 --> 00:08:14.660
+Yeah, you all are living in exciting tech spaces, that's for sure.
+
+00:08:14.900 --> 00:08:15.520
+That's for sure.
+
+00:08:15.520 --> 00:08:15.960
+For sure.
+
+00:08:16.160 --> 00:08:20.780
+Yeah. Well, great to have you all back on the show. Let's start with our first topic.
So the + +00:08:20.850 --> 00:08:26.740 +idea is we've each picked at least a thing that we think stood out in 2025 in the Python space + +00:08:27.580 --> 00:08:33.200 +that we can focus on. And let's go with Jody first. I'm excited to hear what you thought was + +00:08:33.360 --> 00:08:40.080 +one of the bigger things. I'm going to mention AI. Like, wow, what a big surprise. So to kind of + +00:08:40.140 --> 00:08:45.360 +give context of where I'm coming from, I've been working in NLP for a long time. I like to say I was + +00:08:45.260 --> 00:08:50.180 +working on LLMs before they were cool. So sort of playing around with the very first releases from + +00:08:50.320 --> 00:08:56.400 +Google in like 2019, incorporating that into search. So I've been very interested sort of + +00:08:56.620 --> 00:09:03.160 +seeing the unfolding of the GPT models as they've grown. And let's say slightly disgusted by the + +00:09:03.520 --> 00:09:09.519 +discourse around the models as they become more mainstream, more sort of the talk about people's + +00:09:09.540 --> 00:09:15.200 +jobs being replaced, a lot of the hysteria, a lot of the doomsday stuff. So I've been doing talks + +00:09:15.310 --> 00:09:19.820 +and other content for around two and a half years now, just trying to cut through the hype a bit, + +00:09:19.900 --> 00:09:23.660 +being like, you know, they're just language models, they're good for language tasks. Let's think about + +00:09:23.860 --> 00:09:29.360 +realistically what they're about. And what was very interesting for me this year, I've been + +00:09:29.620 --> 00:09:34.759 +incorrectly predicting the bubble bursting for about two and a half years. So I was quite vindicated + +00:09:34.840 --> 00:09:37.600 +when in August, GPT-5 came out, + +00:09:38.030 --> 00:09:40.420 +and all of a sudden, everyone else started saying, + +00:09:40.500 --> 00:09:41.500 +maybe this is a bubble. 
+

+00:09:41.740 --> 00:09:43.800
+Don't you think that was the first big release

+00:09:44.060 --> 00:09:46.940
+that was kind of a letdown compared to what the hype was?

+00:09:47.100 --> 00:09:48.200
+Yeah, and it was really interesting.

+00:09:48.560 --> 00:09:50.980
+So I found this really nice Atlantic article,

+00:09:51.110 --> 00:09:52.540
+and I didn't save it, unfortunately,

+00:09:52.960 --> 00:09:56.140
+but essentially it told sort of the whole story

+00:09:56.350 --> 00:09:57.760
+of what was going on behind the scenes.

+00:09:58.000 --> 00:10:01.820
+So GPT-4 came out in March of 2023,

+00:10:02.390 --> 00:10:03.939
+and that was the model that came out

+00:10:03.960 --> 00:10:08.520
+with this Microsoft research paper saying, you know, sparks of AGI, artificial general intelligence,

+00:10:08.660 --> 00:10:14.740
+blah, blah, blah. And from that point, there was really this big expectation sort of fueled by

+00:10:15.220 --> 00:10:21.460
+OpenAI that GPT-5 was going to be the AGI model. And it turns out what was happening internally

+00:10:22.080 --> 00:10:27.300
+is these scaling laws that were sort of considered, you know, this exponential growth thing that would

+00:10:27.620 --> 00:10:33.199
+sort of push the power of these models perhaps towards human-like performance. They weren't

+00:10:33.220 --> 00:10:38.200
+laws at all. And of course they started failing. So the model that they had originally pitched as

+00:10:38.320 --> 00:10:42.780
+GPT-5 just didn't live up to performance. They started this post-training stuff where they were

+00:10:43.200 --> 00:10:48.160
+going more into like specialized reasoning models. And what we have now are good models that are good

+00:10:48.300 --> 00:10:54.400
+for specific tasks, but I don't know what happened, but eventually they had to put the GPT-5 label on

+00:10:54.560 --> 00:11:01.959
+something. And yeah, let's say it didn't live up to expectations. 
So I think the cracks are starting + +00:11:01.980 --> 00:11:08.620 +to show because the underlying expectation always was this will be improving to the point where + +00:11:08.960 --> 00:11:14.200 +anything's possible and you can't put a price on that. But it turns out that if maybe there's a + +00:11:14.320 --> 00:11:19.180 +limit on what's possible, yeah, you can put a price on it. And a lot of the valuations are on the + +00:11:19.660 --> 00:11:23.780 +first part. Yes. And it's always been a bit interesting to me because I come from a scientific + +00:11:24.000 --> 00:11:28.459 +background and you need to know how to measure stuff, right? And I'm like, what are you trying + +00:11:28.480 --> 00:11:34.120 +to achieve? Like Gregory's nodding, like, please jump in. I'm on my monologue, so please don't + +00:11:34.380 --> 00:11:38.700 +interrupt me. You really need to understand what you're actually trying to get these models to do. + +00:11:38.920 --> 00:11:46.180 +What is AGI? No one knows this. And what's going to be possible with this? And it's more science + +00:11:46.420 --> 00:11:51.580 +fiction than fact. So this for me has been the big news this year, and I'm feeling slightly smug, + +00:11:51.800 --> 00:11:55.180 +I'm going to be honest, even though my predictions were off by about a year and a half. + +00:11:55.320 --> 00:11:56.720 +Yeah, maybe it's not an exponential curve. + +00:11:56.920 --> 00:11:59.580 +It's a titration S curve with an asymptote. + +00:11:59.650 --> 00:11:59.880 +We'll see. + +00:12:00.020 --> 00:12:01.320 +Yeah, sigmoid. + +00:12:01.820 --> 00:12:01.940 +Yeah. + +00:12:02.200 --> 00:12:03.020 +Yeah, yeah, yeah. + +00:12:03.320 --> 00:12:06.740 +I mean, I think we have to sort of separate the technology from the business. 
+ +00:12:07.310 --> 00:12:12.780 +And the technology, even if it doesn't get any better, even if we stay with what we have today, + +00:12:13.260 --> 00:12:17.720 +I still think this is like one of the most amazing technologies I've ever seen. + +00:12:18.180 --> 00:12:19.200 +It's not a god. + +00:12:19.520 --> 00:12:20.820 +It's not a panacea. + +00:12:21.280 --> 00:12:25.200 +But it's like a chainsaw that if you know how to use it, it's really effective. + +00:12:25.760 --> 00:12:31.980 +but in the hands of amateurs, you can really get hurt. And so, yes, it's great to see this sort of + +00:12:32.100 --> 00:12:36.260 +thing happening and improving, but who knows where it's going to go. And I'm a little skeptical of + +00:12:36.260 --> 00:12:40.280 +the AGI thing. What I'm a little more worried about is that these companies seem to have no + +00:12:40.720 --> 00:12:47.260 +possible way of ever making the money that they're promising to their investors. And I do worry a lot + +00:12:47.700 --> 00:12:53.580 +that we're sort of like a year 2000 situation where, yeah, the technology is fantastic, + +00:12:53.660 --> 00:13:03.400 +But the businesses are unsustainable. And out of the ashes of what will happen, we will get some amazing technology and even better than we had before. But there are going to be ashes. + +00:13:03.520 --> 00:13:09.500 +For me, that also makes me worry. And I don't know if anyone reads Ed Zitron here. He's a + +00:13:09.750 --> 00:13:15.020 +journalist kind of digging into the state of the AI industry. He does get a bit sort of, + +00:13:15.340 --> 00:13:19.480 +his reputation is a bit of a crank now. So I think he's leaned into that pretty hard, + +00:13:20.020 --> 00:13:24.880 +but he does take the time to also pull out numbers and point out things that don't make sense. + +00:13:25.320 --> 00:13:29.920 +And he was one of the first ones to sound the whistle on this circular funding we've been seeing. 
+

+00:13:30.160 --> 00:13:40.860
+So the worry, of course, is when a lot of this becomes borrowings from banks and then that starts dragging in funding from everyday people.

+00:13:41.400 --> 00:13:45.720
+And also the effect that this has had on particularly the U.S. economy, like the stock market.

+00:13:45.980 --> 00:13:54.380
+I think the investment in AI spending now exceeds consumer spending in the U.S., which is a really scary prospect.

+00:13:54.860 --> 00:13:55.600
+That is crazy.

+00:13:55.700 --> 00:14:02.080
+Mm-hmm. But yeah, also as Reuven said, I love LLMs. They are the most powerful tools we've

+00:14:02.090 --> 00:14:06.060
+ever had for natural language processing. It's phenomenal the problems we can solve with them

+00:14:06.140 --> 00:14:10.100
+now. I didn't think this sort of stuff would be possible when I started in data science.

+00:14:10.710 --> 00:14:15.560
+I still think there's a use case for agents, although I do think they've been a bit overstated,

+00:14:16.120 --> 00:14:20.060
+especially now that I'm building them. Let's say it's not very fun building

+00:14:20.150 --> 00:14:25.099
+non-deterministic software. It's quite frustrating, actually. But I hope we're going to see improvements

+00:14:25.100 --> 00:14:30.540
+in the frameworks, particularly I've heard good things about Pydantic AI. And yeah, hopefully we

+00:14:30.640 --> 00:14:36.080
+can control the inputs and outputs and make them a bit more strict. This will fix a lot of the problems.

+00:14:36.340 --> 00:14:41.220
+One thing I do want to put out in this conversation, I think is worth separating. And Reuven,

+00:14:41.220 --> 00:14:45.260
+you touched on this some. I want to suggest to you, I'll throw this out to you all and see what

+00:14:45.260 --> 00:14:51.599
+you think. 
I think it's very possible that this AI bubble crashes the economy and causes bad things

+00:14:51.620 --> 00:14:57.780
+economically to happen, and a bunch of companies that are like wrappers over the OpenAI API go away,

+00:14:58.300 --> 00:15:03.800
+but I don't think things like the agentic coding tools will vanish. They might stop training, they

+00:15:03.920 --> 00:15:08.880
+might slow their advance because that's super expensive. But even, as you said, even if we

+00:15:09.160 --> 00:15:16.000
+just had Claude Sonnet 4 and the world never got something else, it would be so much farther

+00:15:16.150 --> 00:15:20.980
+beyond autocomplete and the other stuff that we had before, and Stack Overflow, that I don't

+00:15:20.860 --> 00:15:23.320
+think it's going to go. The reason I'm throwing this out there is I was talking to somebody and

+00:15:23.340 --> 00:15:26.160
+they were like, well, I don't think it's worth learning because I think the bubble is going to

+00:15:26.520 --> 00:15:30.020
+pop. And so I don't want to learn this agentic coding because it won't be around very long.

+00:15:30.340 --> 00:15:34.680
+What do you all think? It's here to stay. I think it's just, where's the limit? Where does it stop?

+00:15:34.940 --> 00:15:40.340
+I think that's the big open question for everybody, right? Like pragmatically, it's a tool. It's useful

+00:15:40.410 --> 00:15:44.100
+in some scenarios and not in others. And you just have to learn how to use the tool appropriately

+00:15:44.300 --> 00:15:47.939
+for your use case and to get what you need out of it. And sometimes that's not using it because

+00:15:47.960 --> 00:15:51.540
+it's just going to take up more time than it will to be productive. But other times it

+00:15:51.960 --> 00:15:56.440
+fully juices up your productivity and you can get more done. It's give and take. 
But I don't think

+00:15:56.500 --> 00:16:00.100
+it's going to go anywhere because as you said, Michael, there's even academics doing research now.

+00:16:00.300 --> 00:16:05.160
+There's open weight models as well. There's a lot of different ways to run this, whether you're

+00:16:05.520 --> 00:16:11.280
+at the scale of the frontier models that are doing these huge trainings or you're doing something

+00:16:11.740 --> 00:16:16.140
+local and more specialized. So I think the general use of AI isn't going anywhere. I think it's just

+00:16:16.080 --> 00:16:21.980
+the question of how far this current trend can go and where it will, I want to say, stop. That's

+00:16:22.260 --> 00:16:25.580
+because that plays into the whole "it's going to completely go away" thing. I don't think it ever

+00:16:25.680 --> 00:16:29.160
+will. I think it's just going to be, where are we going to start to potentially bump up against

+00:16:29.280 --> 00:16:33.540
+limits? One thing that I'll say is that many of these systems are almost, to me, like a dream come

+00:16:33.750 --> 00:16:38.640
+true. Now admittedly, it's the case that the systems I'm building are maybe only tens of thousands of

+00:16:38.700 --> 00:16:44.019
+lines or hundreds of thousands of lines, but I can remember thinking to myself, how cool would it be

+00:16:44.080 --> 00:16:46.880
+if I had a system that could automatically refactor

+00:16:47.440 --> 00:16:51.000
+and then add test cases and increase the code coverage

+00:16:51.380 --> 00:16:54.140
+and make sure all my checkers and linters pass

+00:16:54.440 --> 00:16:57.320
+and do that automatically and continue the process

+00:16:57.650 --> 00:16:58.920
+until it achieved its goal.

+00:16:59.260 --> 00:17:01.620
+And I remember thinking that five to seven years ago,

+00:17:01.750 --> 00:17:04.680
+I would never realize that goal in my entire lifetime. 
+

+00:17:05.189 --> 00:17:07.819
+And now when I use like Anthropic's models

+00:17:08.010 --> 00:17:09.839
+through OpenCode or Claude Code,

+00:17:10.220 --> 00:17:13.120
+it's incredible how much you can achieve so quickly,

+00:17:13.459 --> 00:17:16.480
+even for systems that are of medium to moderate scale.

+00:17:17.040 --> 00:17:19.860
+So from my vantage point, it is a really exciting tool.

+00:17:20.260 --> 00:17:21.420
+It's incredibly powerful.

+00:17:21.870 --> 00:17:24.339
+And what I have found is that the LLMs are much better

+00:17:24.640 --> 00:17:26.500
+when I teach them how to use tools

+00:17:27.040 --> 00:17:28.900
+and the tools that it's using

+00:17:29.260 --> 00:17:31.220
+are actually really quick, fast ones

+00:17:31.660 --> 00:17:33.920
+that can give rapid feedback to the LLM

+00:17:34.240 --> 00:17:35.380
+and tell it whether it's moving

+00:17:35.450 --> 00:17:36.540
+in the right direction or not.

+00:17:36.680 --> 00:17:38.900
+Yeah, there's an engineering angle to this.

+00:17:39.040 --> 00:17:40.180
+It's not just vibe coding

+00:17:40.340 --> 00:17:42.700
+if you take the time to learn it.

+00:17:42.860 --> 00:17:48.040
+There was actually a very interesting study. I don't think the study itself has been released.

+00:17:48.060 --> 00:17:53.340
+I haven't found it yet, but I saw a talk on it by some guys at Stanford. So they call it the 10K

+00:17:53.580 --> 00:17:59.160
+developer study. And basically what they were doing was studying real code bases, including,

+00:17:59.260 --> 00:18:05.260
+I think 80% of them were actually private code bases and seeing the point where the team started

+00:18:05.500 --> 00:18:10.699
+adopting AI. And so their findings are really interesting and nuanced. And I think they probably

+00:18:10.720 --> 00:18:16.080
+intuitively align with what a lot of us have experienced with AI. 
So basically, yes, there + +00:18:16.080 --> 00:18:21.720 +are productivity boosts, but it produces a lot of code, but the code tends to be worse than the code + +00:18:21.720 --> 00:18:27.400 +you would write and also introduces more bugs. So when you account for the time that you spend + +00:18:27.540 --> 00:18:32.260 +refactoring and debugging, you're still more productive. But then it also depends on the + +00:18:32.380 --> 00:18:36.500 +type of project, as Gregory was saying. So it's better for greenfield projects, it's better for + +00:18:36.520 --> 00:18:41.380 +smaller code bases. It's better for simpler problems and it's better for more popular languages because + +00:18:41.540 --> 00:18:46.200 +obviously there's more training data. And so this was actually, I like this study so much. I'll + +00:18:46.260 --> 00:18:49.900 +actually share it with you, Michael, if you want to put it in the show notes, but it shows that, + +00:18:50.000 --> 00:18:53.940 +yeah, the picture is not that simple and all this conflicting information and conflicting experiences + +00:18:54.180 --> 00:19:00.400 +people were having line up completely with this. So again, like I work at an IDE company, it's tools + +00:19:00.540 --> 00:19:06.480 +for the job. It's not like your IDE will replace you. AI is not going to replace you. It's just + +00:19:06.500 --> 00:19:08.200 +going to make you maybe more productive sometimes. + +00:19:08.700 --> 00:19:08.880 +Yeah. + +00:19:09.080 --> 00:19:10.400 +Wait, IDE, you work for me. + +00:19:11.600 --> 00:19:11.700 +Right. + +00:19:12.500 --> 00:19:13.000 +It's not about you. + +00:19:13.000 --> 00:19:14.400 +But then I work for the IDE. + +00:19:18.360 --> 00:19:21.320 +This portion of Talk Python To Me is brought to you by Sentry. + +00:19:22.640 --> 00:19:23.500 +Let me ask you a question. + +00:19:24.280 --> 00:19:25.880 +What if you could see into the future? + +00:19:26.640 --> 00:19:27.900 +We're talking about Sentry, of course. 
+

+00:19:28.160 --> 00:19:33.460
+So that means seeing potential errors, crashes, and bugs before they happen, before you even

+00:19:33.780 --> 00:19:34.720
+accept them into your code base.

+00:19:35.380 --> 00:19:38.500
+That's what Sentry's AI Seer Code Review offers.

+00:19:39.300 --> 00:19:42.640
+You get error prediction based on real production history.

+00:19:43.180 --> 00:19:49.620
+AI Seer Code Review flags the most impactful errors your PR is likely to introduce before merge

+00:19:50.180 --> 00:19:54.980
+using your app's error and performance context, not just generic LLM pattern matching.

+00:19:55.800 --> 00:20:00.880
+Seer will then jump in on new PRs with feedback and warnings if it finds any potential issues.

+00:20:01.680 --> 00:20:02.460
+Here's a real example.

+00:20:03.240 --> 00:20:10.560
+On a new PR related to a search feature in a web app, we see a comment from the Seer by Sentry bot in the PR.

+00:20:11.220 --> 00:20:18.820
+And it says, potential bug, the process search results function can enter an infinite recursion when a search query finds no matches,

+00:20:19.620 --> 00:20:23.800
+as the recursive call lacks a return statement and a proper termination condition.

+00:20:24.360 --> 00:20:32.000
+And Seer AI Code Review also provides additional details which you can expand for further information on the issue and suggested fixes.

+00:20:32.760 --> 00:20:37.780
+And bam, just like that, Seer AI Code Review has stopped a bug in its tracks without any

+00:20:37.980 --> 00:20:38.560
+devs in the loop.

+00:20:38.880 --> 00:20:41.720
+A nasty infinite recursion bug never made it into production.

+00:20:42.440 --> 00:20:43.200
+Here's how you set it up. 
+ +00:20:43.620 --> 00:20:48.940 +You enable the GitHub Sentry integration on your Sentry account, enable Seer AI on your + +00:20:49.180 --> 00:20:53.840 +Sentry account, and on GitHub, you install the Seer by Sentry app and connect it to your + +00:20:53.960 --> 00:20:55.600 +repositories that you want it to validate. + +00:20:55.940 --> 00:20:58.800 +So jump over to Sentry and set up Code Review for yourself. + +00:20:59.320 --> 00:21:03.420 +Just visit talkpython.fm/seer-code-review. + +00:21:03.650 --> 00:21:05.340 +The link is in your podcast player show notes + +00:21:05.580 --> 00:21:06.460 +and on the episode page. + +00:21:06.980 --> 00:21:09.260 +Thank you to Sentry for supporting Talk Python and me. + +00:21:10.140 --> 00:21:12.000 +I mean, the other thing is a lot of people + +00:21:12.440 --> 00:21:15.060 +and a lot of the sort of when people talk about AI + +00:21:15.400 --> 00:21:17.280 +and LLMs and so forth in context of coding, + +00:21:17.800 --> 00:21:20.500 +it's the LLM writing code for us. + +00:21:20.920 --> 00:21:23.440 +And maybe because I'm not doing a lot of serious coding, + +00:21:23.600 --> 00:21:25.060 +it's more instruction and so forth. + +00:21:25.320 --> 00:21:29.280 +I use it as like a sparring or brainstorming partner + +00:21:29.620 --> 00:21:36.800 +So it does, you know, checking of my newsletters for language and for tech edits and just sort of exploring ideas. + +00:21:37.300 --> 00:21:43.320 +And for that, maybe it's because I do everything in the last minute and I don't have other people around or I'm lazy or cheap and don't want to pay them. + +00:21:43.800 --> 00:21:46.620 +But definitely the quality of my work has improved dramatically. + +00:21:46.860 --> 00:21:50.680 +The quality of my understanding has improved, even if it never wrote a line of code for me. + +00:21:50.920 --> 00:21:53.860 +Just getting that feedback on a regular automatic basis is really helpful. 
+

+00:21:54.120 --> 00:21:55.140
+Yeah, I totally agree with you.

+00:21:55.360 --> 00:22:05.520
+All right. We don't want to spend too much time on this topic, even though I believe Jody has put her finger on what might be the biggest tidal wave of 2025.

+00:22:06.040 --> 00:22:08.560
+But still, any quick parting thoughts? Anyone else?

+00:22:08.900 --> 00:22:11.220
+I'm glad I'll never have to write bash from scratch ever again.

+00:22:12.960 --> 00:22:13.640
+Tell me about it.

+00:22:13.740 --> 00:22:14.040
+Yeah.

+00:22:15.800 --> 00:22:21.580
+I'll just say, anecdotally, the thing that I love about it is when I need to do something

+00:22:22.020 --> 00:22:28.220
+and I need to go through docs, online docs for whatever it is, you know, it might be

+00:22:28.380 --> 00:22:31.900
+GitLab or some library that I want to use or something like that.

+00:22:32.150 --> 00:22:33.800
+I never even search for the docs.

+00:22:33.810 --> 00:22:35.680
+I never even try to read the docs anymore.

+00:22:35.810 --> 00:22:40.940
+I just say, hey, you know, whatever model, I need to set up this website.

+00:22:41.300 --> 00:22:43.720
+And I just, just tell me what to do or just do it.

+00:22:43.960 --> 00:22:47.460
+And it's an immense time and productivity saver.

+00:22:47.680 --> 00:22:52.120
+And then it gets me bootstrapped to the place where now I can start to be creative.

+00:22:52.350 --> 00:22:57.800
+I don't have to worry about just like digging through pages and pages and pages of docs to

+00:22:57.950 --> 00:23:00.000
+figure out one little setting here or there.

+00:23:00.320 --> 00:23:02.040
+That's an amazing time saver.

+00:23:02.120 --> 00:23:03.220
+Yeah, that's a really good point.

+00:23:03.500 --> 00:23:07.160
+Another thing that I have noticed, there might be many things for which I had a really good

+00:23:07.390 --> 00:23:10.420
+mental model, but my brain can only store so much information. 
+ +00:23:10.960 --> 00:23:16.000 +So for example, I know lots about the abstract syntax tree for Python, but I forget that + +00:23:16.150 --> 00:23:16.280 +sometimes. + +00:23:16.870 --> 00:23:21.280 +And so it's really nice for me to be able to bring that back into my mind quickly with + +00:23:21.280 --> 00:23:21.960 +an LLM. + +00:23:22.250 --> 00:23:26.940 +And if it's generating code for me that's doing a type of AST parsing, I can tell whether + +00:23:27.080 --> 00:23:30.380 +that's good code or not because I can refresh that mental model. + +00:23:30.740 --> 00:23:34.960 +So in those situations, it's not only the docs, but it's something that I used to know + +00:23:35.070 --> 00:23:37.600 +really well that I have forgotten some of. + +00:23:37.940 --> 00:23:44.180 +And the LLM often is very powerful when it comes to refreshing my memory and helping me to get started and move more quickly. + +00:23:44.320 --> 00:23:48.680 +All right. Out of time, I think. Let's move on to Brett. What do you got, Brett? + +00:23:48.920 --> 00:23:56.020 +Well, I actually originally said we should talk about AI, but Jody had a way better pitch for it than I did because my internal pitch was a little bit AI. + +00:23:56.360 --> 00:24:01.720 +Do I actually have to write a paragraph explaining why? Then Jody actually did write the paragraph. So she did a much better job than I did. + +00:24:01.940 --> 00:24:06.320 +So the other topic I had was using tools to run your Python code. + +00:24:06.680 --> 00:24:09.060 +And what I mean by that is traditionally, if you think about it, + +00:24:10.120 --> 00:24:11.760 +you install the Python interpreter, right? + +00:24:13.480 --> 00:24:15.840 +Hopefully you create a virtual environment, install your dependencies, + +00:24:16.140 --> 00:24:18.940 +and you call the Python interpreter in your virtual environment to run your code. + +00:24:19.560 --> 00:24:21.020 +Those are all the steps you went through to run stuff. 
+

+00:24:21.360 --> 00:24:25.460
+But now we've got tools that will compress all that into a run command,

+00:24:25.920 --> 00:24:26.640
+just do it all for you.

+00:24:26.960 --> 00:24:31.520
+And it seems like the community has shown a level of comfort with that,

+00:24:31.940 --> 00:24:34.380
+that I'd say snuck up on me a little bit,

+00:24:34.680 --> 00:24:37.620
+but I would say that I think it's a good thing, right?

+00:24:37.780 --> 00:24:40.200
+It's showing us, I'm going to say us,

+00:24:40.370 --> 00:24:43.060
+as the junior core developer here on this call,

+00:24:43.440 --> 00:24:45.700
+as to, sorry to make you two feel old,

+00:24:45.840 --> 00:24:48.680
+but admittedly, Barry did write my letter of recommendation

+00:24:48.910 --> 00:24:49.940
+to my master's program.

+00:24:51.260 --> 00:24:52.800
+So what happened was like, yeah,

+00:24:53.100 --> 00:24:54.740
+we had Hatch and PDM,

+00:24:55.290 --> 00:24:58.180
+Poetry before that, and uv as of last year,

+00:24:58.680 --> 00:24:59.360
+all kind of come through

+00:24:59.720 --> 00:25:00.880
+and all kind of build on each other

+00:25:01.200 --> 00:25:02.100
+and take ideas from each other

+00:25:02.240 --> 00:25:03.680
+and kind of just slowly build up

+00:25:03.680 --> 00:25:05.940
+this kind of repertoire of tool approaches

+00:25:06.260 --> 00:25:08.580
+that they all kind of have a baseline kind of,

+00:25:08.900 --> 00:25:09.740
+not that synergy is the right word,

+00:25:09.960 --> 00:25:12.720
+but share just kind of approach to certain things

+00:25:13.220 --> 00:25:15.640
+with their own twists and added takes on things.

+00:25:15.790 --> 00:25:17.300
+But in general, this whole like,

+00:25:17.580 --> 00:25:19.300
+you know what, you can just tell us to run this code

+00:25:19.400 --> 00:25:20.440
+and we will just run it, right? 
+ +00:25:20.540 --> 00:25:22.160 +Like inline script metadata coming in + +00:25:22.260 --> 00:25:23.980 +and help making that more of a thing. + +00:25:24.380 --> 00:25:26.860 +Disclaimer, I was the PEP delegate for getting that in. + +00:25:27.140 --> 00:25:29.400 +But I just think that's been a really awesome trend + +00:25:29.420 --> 00:25:33.900 +And I'm hoping we can kind of leverage that a bit. + +00:25:34.190 --> 00:25:36.660 +Like I have personal plans that we don't need to go into here, + +00:25:36.820 --> 00:25:38.820 +but like I'm hoping as a Python core team, + +00:25:38.870 --> 00:25:41.020 +we can kind of like help boost this stuff up a bit + +00:25:41.020 --> 00:25:43.300 +and kind of help keep a good baseline for this for everyone. + +00:25:43.350 --> 00:25:45.700 +Because I think it's shown that Python is still really good for beginners. + +00:25:45.970 --> 00:25:48.900 +You just have to give them the tools to kind of hide some of the details + +00:25:49.280 --> 00:25:51.800 +to not shoot yourself in the foot and still leads to a great outcome. + +00:25:52.000 --> 00:25:55.640 +Yeah, 2025 might be the year that the Python tools stepped outside of Python. + +00:25:56.180 --> 00:25:59.740 +Instead of being, you install Python and then use the tools. + +00:26:00.190 --> 00:26:01.920 +You do the tool to get Python, right? + +00:26:02.140 --> 00:26:03.660 +Like uv and PDM and others. + +00:26:03.670 --> 00:26:07.880 +Yeah, and inverted the dependency graph in terms of just how you put yourself in, right? + +00:26:08.040 --> 00:26:13.260 +I think the interesting thing is these tools treat Python as an implementation detail almost, right? + +00:26:13.520 --> 00:26:17.480 +Like when you just say uv or hatch run or PDM run thing, + +00:26:17.920 --> 00:26:19.520 +these tools don't make you have to think about the interpreter. + +00:26:19.740 --> 00:26:22.480 +It's just a thing that they pull in to make your code run, right? 
+ +00:26:22.600 --> 00:26:25.940 +It's not even necessarily something you have to care about if you choose not to. + +00:26:26.100 --> 00:26:29.940 +And it's an interesting shift in that perspective, at least for me. + +00:26:30.050 --> 00:26:31.360 +But I've also been doing this for a long time. + +00:26:31.360 --> 00:26:33.080 +I think you're really onto something. + +00:26:33.460 --> 00:26:39.480 +And what I love at sort of a high level is this, I think there's a renewed focus on the user experience. + +00:26:40.320 --> 00:26:49.080 +And like uv plus the PEP 723, the inline metadata, you know, you can put uv in the shebang line of your script. + +00:26:49.560 --> 00:26:51.760 +And now you don't have to think about anything. + +00:26:52.120 --> 00:26:56.340 +You get uv from somewhere, and then it takes care of everything. + +00:26:57.790 --> 00:27:00.800 +And Hatch can work the same way, I think, for developers. + +00:27:01.100 --> 00:27:08.140 +But this renewed focus on installing your Python executable, + +00:27:08.340 --> 00:27:11.880 +you don't really have to think about, because those things are very complicated, + +00:27:12.030 --> 00:27:14.420 +and people just want to hit the ground running. + +00:27:14.950 --> 00:27:17.600 +And so if you think about the previous discussion about AI, + +00:27:18.260 --> 00:27:19.880 +I just want things to work. + +00:27:20.140 --> 00:27:21.760 +I know what I want to do. + +00:27:22.220 --> 00:27:23.140 +I can see it. + +00:27:23.140 --> 00:27:24.300 +I can see the vision of it. + +00:27:24.300 --> 00:27:25.640 +And I just don't want to. + +00:27:25.960 --> 00:27:28.740 +An analogy is like when I first learned Python + +00:27:28.960 --> 00:27:31.580 +and I came from C++ and all those languages. + +00:27:32.280 --> 00:27:35.140 +And I thought, oh my gosh, just to get like, hello world, + +00:27:35.280 --> 00:27:39.420 +I have to do a million little things that I shouldn't have to do. 
+ +00:27:39.680 --> 00:27:43.120 +Like create a main and get my braces right + +00:27:43.140 --> 00:27:46.760 +and get all my variables right and get my pound includes correct. + +00:27:46.920 --> 00:27:48.980 +And now I don't have to think about any of that stuff. + +00:27:49.260 --> 00:27:59.980 +And the thing that was eye-opening for me with Python was the distance between vision of what I wanted and working code just really narrowed. + +00:28:00.100 --> 00:28:09.320 +And I think that as we are starting to think about tools and environments and how to bootstrap all this stuff, we're also now taking all that stuff away. + +00:28:09.840 --> 00:28:11.580 +Because people honestly don't care. + +00:28:11.650 --> 00:28:13.320 +I don't care about any of that stuff. + +00:28:13.330 --> 00:28:18.020 +I just want to go from like, I woke up this morning and had a cool idea and I just wanted to get at work. + +00:28:18.860 --> 00:28:22.080 +Or you wanted to share it so you could just share the script and you don't have to say, + +00:28:22.460 --> 00:28:24.300 +here's your steps that you get started with. + +00:28:24.980 --> 00:28:25.300 +Exactly. + +00:28:25.800 --> 00:28:26.020 +Exactly. + +00:28:26.320 --> 00:28:28.660 +I want to thank the two of you for, oh, sorry, sorry, go ahead. + +00:28:28.960 --> 00:28:34.220 +I'm just going to say, like, for years teaching Python that how do we get it installed? + +00:28:34.720 --> 00:28:37.580 +At first, it surprised me how difficult it was for people. + +00:28:37.700 --> 00:28:39.260 +Because like, oh, come on, we just got Python. + +00:28:39.460 --> 00:28:40.380 +Like, what's so hard about this? + +00:28:40.620 --> 00:28:44.740 +But it turns out it's a really big barrier to entry for newcomers. + +00:28:45.200 --> 00:28:48.920 +And I'm very happy that Jupyter Lite now has solved its problems with input. + +00:28:49.320 --> 00:28:50.520 +And it's like huge. 
+

+00:28:50.960 --> 00:28:56.820
+But until now, I hadn't really thought about starting with uv because it's cross-platform.

+00:28:57.180 --> 00:29:02.100
+And if I say to people in the first 10 minutes of class, install uv for your platform and

+00:29:02.100 --> 00:29:05.100
+then say uv init your project, bam, you're done.

+00:29:05.320 --> 00:29:06.060
+It just works.

+00:29:06.440 --> 00:29:07.200
+And then it works cross-platform.

+00:29:07.580 --> 00:29:08.520
+This is mind-blowing.

+00:29:08.700 --> 00:29:10.140
+And I'm going to try this at some point.

+00:29:10.220 --> 00:29:10.540
+Thank you.

+00:29:10.640 --> 00:29:14.860
+I can comment on the mind-blowing part because now when I teach undergraduate students, we

+00:29:14.880 --> 00:29:20.360
+start with uv in the very first class. And it is awesome. There were things that would take students,

+00:29:20.820 --> 00:29:25.920
+even very strong students who've had lots of experience, it would still take them a week to

+00:29:26.160 --> 00:29:30.160
+set everything up on their new laptop and get everything ready and to understand all the key

+00:29:30.580 --> 00:29:36.640
+concepts and know where something is in their path. And now we just say, install uv for your

+00:29:36.900 --> 00:29:42.820
+operating system and get running on your computer. And then, hey, you're ready to go. 
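
The classroom bootstrap being described boils down to a handful of commands. A sketch, assuming a Unix-like shell and a made-up project name (uv's installer differs per platform, so check its docs for the Windows equivalent):

```shell
# Install uv itself (macOS/Linux one-liner; Windows uses a PowerShell command).
curl -LsSf https://astral.sh/uv/install.sh | sh

# Scaffold a project: pyproject.toml plus a sample entry-point script.
uv init my-project
cd my-project

# Add a dependency: uv creates the .venv, picks (or downloads) a suitable
# Python, and resolves and pins everything in uv.lock.
uv add requests

# Run the code; no manual interpreter install or venv activation needed.
uv run main.py
```

The point of the sketch is the shape of the workflow: the student never types `python`, never activates an environment, and never touches PATH, which is the "Python as an implementation detail" shift discussed earlier.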
And I don't have to

+00:29:42.780 --> 00:29:47.040
+teach them about Docker containers, and I don't have to tell them how to install Python with some

+00:29:47.160 --> 00:29:52.900
+package manager. All of those things just work. And I think from a learning perspective, whether you're

+00:29:52.960 --> 00:29:58.540
+in a class or whether you're in a company or whether you're teaching yourself, uv is absolutely

+00:29:58.900 --> 00:30:04.960
+awesome. I'm actually wondering whether I am the one who is newest to Python here. I taught myself

+00:30:05.280 --> 00:30:12.740
+Python in 2011, so I was at like the Python 2.7 stage, but it was my first programming language. I was just

+00:30:12.760 --> 00:30:17.920
+procrastinating during my PhD. And I was like, I should learn to program. So I just taught myself

+00:30:18.140 --> 00:30:23.360
+Python. And I can tell you, if you do not come from an engineering background, you're like,

+00:30:23.430 --> 00:30:29.240
+what is Python? What is Python doing? Why am I typing python to execute this hello world? And

+00:30:29.300 --> 00:30:33.840
+if you're kind of curious, you get down a rabbit hole before you even get to the point where you're

+00:30:33.960 --> 00:30:39.220
+just focusing on learning the basics. And so it's exactly what Reuven was saying.

+00:30:39.240 --> 00:30:43.160
+And I wonder whether you thought about it for teaching, because we're now debating for Humble Data,

+00:30:43.260 --> 00:30:48.040
+which is a beginner's data science community that I'm part of, whether we switch to uv.

+00:30:48.320 --> 00:30:52.180
+This was Cheuk's idea because it does abstract away all these details.

+00:30:52.720 --> 00:30:55.320
+The debate I have is, is it too magic?
+

+00:30:55.820 --> 00:30:59.200
+This is kind of the problem because I also remember learning about things like virtual

+00:30:59.500 --> 00:31:02.540
+environments, because again, this was my first programming language, and being like,

+00:31:02.780 --> 00:31:03.740
+oh, it's a very good idea.

+00:31:04.060 --> 00:31:04.980
+This is best practices.

+00:31:05.280 --> 00:31:07.920
+And it's also a debate we have in PyCharm, right?

+00:31:08.220 --> 00:31:14.780
+Like how much do you magic away the fundamentals versus making people think a little bit, but

+00:31:15.140 --> 00:31:15.560
+I'm not sure.

+00:31:15.660 --> 00:31:15.840
+All right.

+00:31:15.920 --> 00:31:18.720
+Like, would you even let somebody run without a virtual environment?

+00:31:19.000 --> 00:31:21.180
+That's a stance you could take.

+00:31:21.300 --> 00:31:27.620
+I used to when I first learned Python, because it was too complicated, but then I learned

+00:31:27.840 --> 00:31:28.140
+better.

+00:31:28.640 --> 00:31:29.160
+But yes.

+00:31:29.320 --> 00:31:34.380
+The consideration here is, hiding the details and having

+00:31:34.430 --> 00:31:37.580
+all this magic just work is great as long as it works.

+00:31:38.020 --> 00:31:43.620
+And the question is, how is it going to break down, and how are people going to know how to deal

+00:31:43.830 --> 00:31:49.540
+with it when it breaks down, if you hide all the magic? And I think before

+00:31:49.650 --> 00:31:54.740
+we had virtual envs, installing packages was very much in the you-had-to-know-all-the-details

+00:31:54.910 --> 00:32:00.300
+territory, because it was very likely going to break down in some way,

+00:32:00.560 --> 00:32:04.880
+because you would end up with weird conflicts or multiple copies of a package installed in

+00:32:04.840 --> 00:32:09.820
+different parts of the system.
When we got virtual envs, we sort of didn't have to worry about that

+00:32:10.020 --> 00:32:14.380
+anymore because we were trained that you can just blow away the virtual env and it just works.

+00:32:14.920 --> 00:32:19.900
+And with uv, we're back into, this looks like a single installation. We don't know what's going

+00:32:19.900 --> 00:32:25.360
+on, but we've learned, we as a community and also the people working on uv, we have learned

+00:32:25.700 --> 00:32:31.640
+from those earlier mistakes, or maybe not mistakes, but consequences of the design.

+00:32:32.180 --> 00:32:38.160
+And they have created something that appears to be very stable, where it's unlikely

+00:32:38.280 --> 00:32:39.220
+the magic will break.

+00:32:39.460 --> 00:32:43.940
+And when the magic does break, it's obvious what the problem is, or it automatically

+00:32:44.100 --> 00:32:44.620
+fixes itself.

+00:32:45.100 --> 00:32:50.480
+So it's not reusing broken installations and that kind of thing.

+00:32:50.720 --> 00:32:57.220
+So the risk now, as it turns out, I think as is proven by the community adopting uv so

+00:32:57.860 --> 00:33:00.160
+fast and so willingly, I think it's acceptable.

+00:33:00.180 --> 00:33:01.860
+Well, I think it's, yeah, I think it's proven itself.

+00:33:02.140 --> 00:33:09.380
+It's clear that it's worth the potential of discovering weird edge cases later, both

+00:33:09.520 --> 00:33:16.060
+because it's probably low likelihood, but also because the people behind uv, Astral, have proven that

+00:33:16.200 --> 00:33:18.600
+they would jump in and fix those issues, right?

+00:33:18.600 --> 00:33:22.840
+They would do anything they need to keep uv workable the same way.

+00:33:23.190 --> 00:33:28.920
+And they have a focus that Python as a whole cannot have, because they cater to fewer use

+00:33:28.940 --> 00:33:31.200
+cases than Python as a whole needs to.
+

+00:33:31.320 --> 00:33:36.160
+From the audience, Galano says, as an enterprise tech director and Python coder, I believe we

+00:33:36.300 --> 00:33:40.500
+should hide the magic, which empowers the regular employee to do simple things that make their

+00:33:40.620 --> 00:33:40.960
+job easier.

+00:33:41.420 --> 00:33:41.500
+Yeah.

+00:33:41.650 --> 00:33:46.800
+This notion of abstractions, right, has always been there in computer science.

+00:33:47.980 --> 00:33:53.560
+And, you know, we've used tools or languages or systems where we've tried to bring that

+00:33:53.580 --> 00:33:58.600
+abstraction layer up so that we don't have to think about all these details, as I mentioned

+00:33:58.790 --> 00:34:04.080
+before. The question is, that's always the happy path. And when I'm trying to teach somebody

+00:34:04.300 --> 00:34:08.960
+something like, here's how to use this library or here's how to use this tool, I try to be very

+00:34:09.220 --> 00:34:14.659
+opinionated to keep people on that happy path. Like, assume everything's going to work just right.

+00:34:15.020 --> 00:34:19.880
+Here's how you just go down that path to get the thing done that you want. The question

+00:34:19.899 --> 00:34:27.200
+really is, when things go wrong, how narrow is that abstraction? And are you able, even when

+00:34:27.200 --> 00:34:31.960
+you're just curious, to see what's really going on underneath the hood? Of course, that's not a really

+00:34:31.960 --> 00:34:36.540
+good analogy today because cars are basically computers on wheels that you can't really

+00:34:36.760 --> 00:34:42.540
+understand how they work. But back in your day. But back in my day, we were changing spark plugs,

+00:34:42.620 --> 00:34:49.860
+you know. And crank that window down. Exactly.
So I think we always have to leave that

+00:34:50.280 --> 00:34:56.620
+room for the curious and the bad path, where when things go wrong or when you're just like,

+00:34:56.810 --> 00:35:01.540
+you know what, I understand how this works, but I'm kind of curious about what's really going on.

+00:35:01.780 --> 00:35:06.920
+How easy is it for me to dive in and get a little bit more of that background, you know,

+00:35:07.140 --> 00:35:11.620
+a little bit more of that understanding of what's going on. I want the magic to decompose,

+00:35:12.020 --> 00:35:16.880
+right? Like, you should be able to explain the magic path via more decomposed steps using the tool,

+00:35:17.080 --> 00:35:21.760
+all the way down to what the tool does behind the scenes. Just to admit it, the reason

+00:35:21.850 --> 00:35:25.900
+I brought this up, and I've been thinking about this a lot, is I'm thinking of trying to get the Python

+00:35:26.140 --> 00:35:31.380
+Launcher to do a bit more, because one interesting thing we haven't really brought up here is we're

+00:35:31.390 --> 00:35:36.960
+all saying uv, uv, uv. uv is a company. They might disappear, and we haven't

+00:35:36.980 --> 00:35:41.640
+de-risked ourselves from that. Now we do have Hatch, we do have PDM, but as I said, there's kind

+00:35:41.640 --> 00:35:45.440
+of a baseline I think they all share that would probably be okay if the Python Launcher

+00:35:45.580 --> 00:35:48.800
+just did, because that's based on standards, right? Because that's the other thing, there's been

+00:35:48.800 --> 00:35:52.600
+a lot of work that has led to this step, right? Like we've gotten way more packaging standards,

+00:35:52.900 --> 00:35:58.220
+we've got PEP 723, like we mentioned.
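[Editor's note: for readers who haven't seen PEP 723, here is a minimal sketch of the inline script metadata being discussed. This is an illustrative example, not code from the episode; the filename and dependency are hypothetical. A standards-aware runner like uv parses the special comment block and sets up an environment, e.g. `uv run demo.py`, while the interpreter itself ignores it.]

```python
# demo.py -- a single-file script that carries its own requirements
# via PEP 723 inline metadata. Tools such as uv read the block below
# and create an ephemeral environment with the declared Python version
# and dependencies before running the file; to plain Python it is
# nothing but comments.

# /// script
# requires-python = ">=3.10"
# dependencies = [
#     "httpx",
# ]
# ///

# The body deliberately does not import the declared dependency, so
# this sketch also runs under a bare interpreter with nothing installed.
MESSAGE = "hello from a PEP 723 script"
print(MESSAGE)
```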
There's a lot of work that's come up to lead to this point

+00:35:58.460 --> 00:36:03.920
+that all these tools can lean on to have them all have an equivalent outcome, because that's

+00:36:03.940 --> 00:36:04.700
+how it's expected they should behave.

+00:36:05.199 --> 00:36:07.740
+And so I think it's something we need to consider,

+00:36:08.210 --> 00:36:09.360
+of how do we make sure.

+00:36:09.820 --> 00:36:10.740
+Like, by the way, uv,

+00:36:11.230 --> 00:36:12.140
+I know the people, they're great.

+00:36:12.280 --> 00:36:13.640
+I'm not trying to disparage them

+00:36:13.760 --> 00:36:14.620
+or say they're going to go away,

+00:36:14.780 --> 00:36:16.320
+but it is something we have to consider.

+00:36:16.800 --> 00:36:18.100
+And I will also say, Jodie,

+00:36:18.480 --> 00:36:19.780
+I do think about this for teaching

+00:36:20.080 --> 00:36:21.480
+because I'm a dad now

+00:36:21.880 --> 00:36:23.460
+and I don't want my kid coming home

+00:36:24.180 --> 00:36:25.820
+when they get old enough to learn Python

+00:36:25.910 --> 00:36:26.720
+and go, hey, dad,

+00:36:26.880 --> 00:36:28.980
+why is getting Python code running so hard?

+00:36:29.240 --> 00:36:31.420
+So I want to make sure that that never happens.

+00:36:32.160 --> 00:36:34.680
+But that they fall in love with it from the start.

+00:36:34.940 --> 00:36:37.600
+I realized something for the 2026 year-in-review episode.

+00:36:37.680 --> 00:36:42.960
+I have to bring a sign that says time for next topic because we got a bunch of topics and we're running low on time.

+00:36:43.640 --> 00:36:45.660
+So, Thomas, let's jump over to yours.

+00:36:46.120 --> 00:36:48.240
+Oh, and I had two topics as well.

+00:36:48.400 --> 00:36:52.340
+So I'm only going to have to pick my favorite child, right?

+00:36:52.540 --> 00:36:52.920
+That's terrible.

+00:36:53.900 --> 00:36:57.960
+My second favorite child is lazy imports, which is a relatively new development.
+

+00:36:58.280 --> 00:37:00.000
+So we'll probably not get to that.

+00:37:00.020 --> 00:37:00.120
+And it was just accepted.

+00:37:00.300 --> 00:37:02.360
+Yes, it's been accepted and it's going to be awesome.

+00:37:03.140 --> 00:37:06.620
+So I'll just give that a shout out and then move to my favorite child, which is free-threaded

+00:37:06.850 --> 00:37:06.980
+Python.

+00:37:07.580 --> 00:37:11.640
+For those who were not aware, the global interpreter lock is going away.

+00:37:12.160 --> 00:37:13.700
+I am stating it as a fact.

+00:37:13.840 --> 00:37:17.780
+It's not actually a fact yet, but, you know, that's because the steering council hasn't

+00:37:18.240 --> 00:37:19.460
+realized the fact yet.

+00:37:20.860 --> 00:37:21.940
+It is trending that way.

+00:37:22.320 --> 00:37:28.820
+Well, I was originally on the steering council that accepted the proposal to add free threading

+00:37:28.840 --> 00:37:34.500
+as an experimental feature. We had this idea of adding it as experimental, and then making it

+00:37:34.600 --> 00:37:39.300
+supported but not the default, and then making it the default. And it was all a little vague

+00:37:39.500 --> 00:37:43.800
+and up in the air. And then I didn't get reelected for the steering council last year,

+00:37:44.060 --> 00:37:49.420
+which I was not sad about at all. I sort of ran on a, well, if there's nobody better, I'll do it,

+00:37:49.520 --> 00:37:54.820
+but otherwise I have other things to do. And it turns out those other things were making sure that

+00:37:54.840 --> 00:37:59.700
+free-threaded Python landed in a supported state. So I lobbied the steering council quite hard,

+00:38:00.040 --> 00:38:04.740
+as Barry might remember, at the start of the year, to get some movement on this, like get some

+00:38:04.900 --> 00:38:10.500
+decision going. So for Python 3.14, it is officially supported. The performance is great.
It's like

+00:38:10.820 --> 00:38:16.360
+between a couple of percent slower and 10% slower, depending on the hardware and the compiler that

+00:38:16.360 --> 00:38:23.240
+you use. It's basically the same speed on macOS, which is really, like, that's a combination of

+00:38:23.380 --> 00:38:29.240
+the ARM hardware and Clang specializing things, but it's basically the same speed, which, wow.

+00:38:29.580 --> 00:38:35.040
+And then on recent GCCs on Linux, it's like a couple of percent slower. The main problem is

+00:38:35.080 --> 00:38:40.380
+really community adoption, getting third-party packages to update their extension modules for

+00:38:40.780 --> 00:38:48.560
+the new APIs and the things that by necessity sort of broke, and also supporting free threading

+00:38:48.420 --> 00:38:54.060
+in a good way in packages. For Python code, it turns out there's very few changes that

+00:38:54.070 --> 00:38:56.840
+need to be made for things to work well under free threading.

+00:38:57.400 --> 00:39:02.680
+They might not be entirely thread safe, but almost always in cases where it

+00:39:02.920 --> 00:39:07.080
+wasn't thread safe before either, because the GIL doesn't actually affect thread safety.

+00:39:07.820 --> 00:39:09.300
+Just the likelihood of things breaking.

+00:39:09.600 --> 00:39:14.920
+I do think there's been a bit of a, the mindset of the Python community hasn't really been

+00:39:14.940 --> 00:39:18.380
+focused on creating thread-safe code because the GIL is supposed to protect us.

+00:39:18.460 --> 00:39:22.160
+But as soon as it takes multiple steps, then all of a sudden it's just less likely.

+00:39:22.340 --> 00:39:23.540
+It's not that it couldn't happen.

+00:39:23.780 --> 00:39:24.760
+Yeah, that's my point, right?

+00:39:24.860 --> 00:39:27.160
+It's not that the GIL ever gave you thread safety.

+00:39:27.460 --> 00:39:30.720
+The GIL gave CPython's internals thread safety.
+

+00:39:31.120 --> 00:39:35.980
+It never really affected Python code, and it very rarely affected thread safety in

+00:39:36.260 --> 00:39:37.480
+extension modules as well.

+00:39:37.620 --> 00:39:41.780
+So they already had to take care of making sure that the global interpreter

+00:39:41.800 --> 00:39:46.560
+lock couldn't be released by something that they ended up calling indirectly. So it's actually not that

+00:39:46.760 --> 00:39:52.980
+hard to port most things to support free threading. And the benefits, we've seen some experimental

+00:39:53.030 --> 00:39:56.320
+work, because, you know, it's still new. There's still a lot of things that don't

+00:39:56.720 --> 00:40:02.060
+quite support it. There's still places where thread contention slows things down a lot. But

+00:40:02.180 --> 00:40:10.020
+we've seen a lot of examples of really promising, very parallel problems that now speed up by 10x or

+00:40:09.980 --> 00:40:15.560
+more. And it's going to be really exciting in the future. And it's in 2025 that this all started.

+00:40:15.820 --> 00:40:21.620
+I mean, Sam started it earlier, he's been working on this for years, but it landed in 2025.

+00:40:21.880 --> 00:40:26.760
+It dropped its experimental stage in 3.14, basically. Yeah. I was going to say, were we all,

+00:40:27.040 --> 00:40:30.760
+the three of us, on the steering council at the same time when we decided to start the experiment

+00:40:30.940 --> 00:40:36.320
+for free threading? I think Barry wasn't on it. Yeah, I missed a couple of years there, but I'm

+00:40:36.340 --> 00:40:36.720
+not sure.

+00:40:36.920 --> 00:40:37.640
+No, I totally agree.

+00:40:37.640 --> 00:40:48.600
+I think free threading is one of the most transformative developments for Python, certainly since Python 3, but maybe even more impactful because of the size of the community today.
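[Editor's note: a minimal sketch of the thread-safety point being made, not code from the episode. A plain `counter += 1` is a read-modify-write of three steps, so the GIL never made it atomic; explicit locking is what actually gives thread safety, with or without free threading.]

```python
import threading

# counter += 1 is really three steps (read, add, write back), so it was
# never atomic under the GIL either; the lock serializes the whole
# read-modify-write, which is what actually makes it thread safe.
counter = 0
lock = threading.Lock()

def add(iterations: int) -> None:
    global counter
    for _ in range(iterations):
        with lock:  # drop this lock and some increments can be lost
            counter += 1

threads = [threading.Thread(target=add, args=(50_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

# With the lock held around each increment, the result is deterministic.
print(counter)  # 200000
```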
+ +00:40:49.300 --> 00:40:56.420 +Personally, you know, not necessarily speaking as a current or potentially former steering council member. + +00:40:56.680 --> 00:40:58.480 +We'll see how that shakes out. + +00:40:58.720 --> 00:41:00.080 +But I think it's inevitable. + +00:41:00.460 --> 00:41:08.360 +I think free threading is absolutely the future of Python, and I think it's going to unlock incredible potential and performance. + +00:41:08.930 --> 00:41:10.460 +I think we just have to do it right. + +00:41:10.630 --> 00:41:15.800 +And so I talked to lots of teams who are building various software all over the community. + +00:41:16.640 --> 00:41:23.580 +And I actually think it's more of an educational and maybe an outreach problem than it is a technological problem. + +00:41:23.720 --> 00:41:28.800 +I mean, yes, there are probably APIs that are missing that will make people's lives easier. + +00:41:30.300 --> 00:41:36.040 +There's probably some libraries that will make other code a little easier to write or whatever or to understand. + +00:41:36.900 --> 00:41:37.980 +But like all that's solvable. + +00:41:38.100 --> 00:41:42.620 +And I think really reaching out to the teams that are, you know, like Thomas said, + +00:41:42.640 --> 00:41:46.860 +that are building the ecosystem, that are moving the ecosystem to a free threading world. + +00:41:47.380 --> 00:41:50.040 +That's where we really need to spend our effort on. + +00:41:50.160 --> 00:41:51.100 +And we'll get there. + +00:41:51.360 --> 00:41:52.140 +It won't be that long. + +00:41:52.400 --> 00:41:55.480 +It certainly won't be as long as it took us to get to Python 3. + +00:41:56.920 --> 00:42:04.000 +I'm sort of curious as someone who's not super experienced with threading or, you know, basic concurrency. + +00:42:04.560 --> 00:42:13.820 +I mean, I've used it, but I feel like now we have threads, especially with free threading and sub interpreters and multiprocessing and asyncio. 
+

+00:42:14.360 --> 00:42:19.560
+And I feel like for many people now it's like, oh, my God, which one am I supposed to use?

+00:42:19.920 --> 00:42:24.020
+And for someone who's experienced, you can sort of say, well, this seems like a better choice.

+00:42:24.140 --> 00:42:31.060
+But are there any plans to sort of try to have a taxonomy of what problems are solved by which of these?

+00:42:31.380 --> 00:42:37.360
+The premise here is that everyone would be using one or more of these low-level techniques that you mentioned.

+00:42:37.700 --> 00:42:40.620
+And I think that's not a good way of looking at it.

+00:42:40.860 --> 00:42:46.340
+Like AsyncIO is a library that you want to use for the things that AsyncIO is good at.

+00:42:46.580 --> 00:42:52.220
+And you can actually very nicely combine it with multiprocessing, with subprocesses,

+00:42:52.510 --> 00:42:57.700
+and with subinterpreters, just to make it clear that those are two very separate

+00:42:57.860 --> 00:43:00.800
+things, and multithreading, both with and without free threading.

+00:43:01.100 --> 00:43:06.560
+And it solves different problems, or it gives you different abilities, within the AsyncIO

+00:43:06.760 --> 00:43:06.900
+framework.

+00:43:06.950 --> 00:43:09.100
+And the same is true for like GUI frameworks.

+00:43:09.730 --> 00:43:14.360
+I mean, GUI frameworks usually want threads for multiple reasons, but you can use these

+00:43:14.490 --> 00:43:15.300
+other things as well.

+00:43:15.720 --> 00:43:21.580
+I don't think it's down to teaching end users when to use or avoid all these different things.

+00:43:22.070 --> 00:43:26.960
+I think we need higher-level abstractions for tasks that people want to solve.

+00:43:27.270 --> 00:43:32.980
+And then those can decide what, for their particular use case, is a better approach.

+00:43:33.170 --> 00:43:35.840
+For instance, PyTorch has multiple.
+

+00:43:36.090 --> 00:43:42.180
+So it's used, for people who don't know, to train, not just train, but it's used in AI

+00:43:42.200 --> 00:43:45.720
+for generating large matrices and LLMs and what have you.

+00:43:46.100 --> 00:43:49.160
+Part of it is loading data and processing.

+00:43:49.680 --> 00:43:52.640
+And the basic ideas of AsyncIO are,

+00:43:52.860 --> 00:43:54.720
+oh, you can do all these things in parallel

+00:43:55.040 --> 00:43:56.500
+because you're not waiting on the CPU,

+00:43:56.720 --> 00:43:57.680
+you're just waiting on IO.

+00:43:58.200 --> 00:44:00.720
+Turns out it is still a good idea to use threads

+00:44:00.960 --> 00:44:02.400
+for massively parallel IO

+00:44:02.600 --> 00:44:05.740
+because otherwise you end up waiting longer than you need to.

+00:44:05.940 --> 00:44:10.120
+So a problem where we thought AsyncIO would be the solution

+00:44:10.140 --> 00:44:15.520
+and we never needed threads is actually much improved if we tie in threads as well.

+00:44:15.730 --> 00:44:19.200
+And we've seen massive, massive improvements in data loaders.

+00:44:19.320 --> 00:44:23.660
+There's even an article, a published article from some people at Meta,

+00:44:24.200 --> 00:44:28.840
+showing how much they improved the PyTorch data loader by using multiple threads.

+00:44:29.210 --> 00:44:33.240
+But at a very low level, we don't want end users to need to make that choice, right?

+00:44:33.300 --> 00:44:35.380
+I think concurrent.futures is a good point, right?

+00:44:35.500 --> 00:44:39.300
+Like all of these approaches are all supported there and it's a unified one.

+00:44:39.880 --> 00:44:42.320
+So if you were to teach this, for instance, you could say use concurrent.futures.

+00:44:42.880 --> 00:44:43.500
+These are all there.

+00:44:44.130 --> 00:44:45.460
+This is the potential tradeoff.

+00:44:45.550 --> 00:44:46.800
+Like basically use threads.
+

+00:44:46.940 --> 00:44:51.280
+It's going to be the fastest, unless there's like some module you have that's

+00:44:51.400 --> 00:44:53.040
+screwing up because of threads, then use subinterpreters.

+00:44:53.540 --> 00:44:57.540
+And if for some reason subinterpreters don't work, you should move to the processing pool,

+00:44:57.710 --> 00:44:58.340
+the process pool.

+00:44:58.680 --> 00:45:02.140
+But I mean, basically, you just kind of go for the fast stuff.

+00:45:02.190 --> 00:45:03.300
+And if for some reason it doesn't work,

+00:45:03.390 --> 00:45:05.420
+use the next fastest, and just kind of do it that way.

+00:45:05.860 --> 00:45:07.820
+After that, then you start to go lower level.

+00:45:08.240 --> 00:45:11.240
+Like, okay, why do I want to use subinterpreters instead of threads?

+00:45:11.730 --> 00:45:12.260
+Those kinds of things.

+00:45:12.390 --> 00:45:15.600
+But I think that's a different, as I think we're all searching,

+00:45:15.830 --> 00:45:18.840
+a different level of abstraction, which is a term we keep bringing up today.

+00:45:19.520 --> 00:45:21.520
+It's a level that a lot of people are not going to have to care about.

+00:45:21.520 --> 00:45:23.540
+I think the libraries are the ones that are going to have to care about this

+00:45:23.570 --> 00:45:25.520
+and who are going to do a lot of this for you.

+00:45:25.620 --> 00:45:29.400
+Let me throw this out on our way out the door to get to Reuven's topic.

+00:45:29.920 --> 00:45:32.860
+I would love to see it solidify around async and await.

+00:45:33.150 --> 00:45:36.140
+And you just await a thing, maybe put a decorator on something.

+00:45:36.260 --> 00:45:42.040
+Say this one, I want this to be threaded. I want this to be IO. I want this. And you don't,

+00:45:42.100 --> 00:45:46.080
+you just use async and await and don't have to think about it, but that's my dream.
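[Editor's note: a concrete sketch of the "unified API" idea discussed above, illustrative rather than from the show. With concurrent.futures the submission code is written once and the backend is swapped: ThreadPoolExecutor for threads, ProcessPoolExecutor for processes, and, in Python 3.14, InterpreterPoolExecutor for subinterpreters. The `work` function and sizes here are made up for the example.]

```python
import concurrent.futures
import math

def work(n: int) -> int:
    # Stand-in for a CPU-ish task; any picklable top-level function works.
    return sum(math.isqrt(i) for i in range(n))

def run_all(executor_cls, sizes):
    # The surrounding code is identical no matter which executor is used:
    # ThreadPoolExecutor, ProcessPoolExecutor, or (3.14+) InterpreterPoolExecutor.
    with executor_cls(max_workers=4) as pool:
        return list(pool.map(work, sizes))

sizes = [10_000, 20_000, 30_000]
results = run_all(concurrent.futures.ThreadPoolExecutor, sizes)

# Swapping the backend is a one-word change, e.g.:
#   run_all(concurrent.futures.ProcessPoolExecutor, sizes)
print(results == [work(n) for n in sizes])  # True
```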
+

+00:45:46.800 --> 00:45:47.800
+Reuven, what's your dream?

+00:45:48.700 --> 00:45:49.740
+Wow. How long do you have?

+00:45:51.000 --> 00:45:51.600
+No, what's your topic?

+00:45:52.640 --> 00:45:58.480
+So I want to talk about the Python ecosystem and funding. When I talk to people about Python

+00:45:59.060 --> 00:46:01.820
+and how it's open source, they're like, oh, right, it's open source. That

+00:46:01.940 --> 00:46:05.360
+means I can download it for free. And from their perspective, that's sort of where it starts

+00:46:05.640 --> 00:46:10.460
+and ends. And the notion that people work on it, the notion that it needs funding, the notion that

+00:46:10.620 --> 00:46:15.060
+there's a Python Software Foundation that supports a lot of these activities, the infrastructure,

+00:46:15.580 --> 00:46:21.400
+is completely unknown to them and even quite shocking for them to hear. But Python is in many

+00:46:21.560 --> 00:46:26.720
+ways, I think, starting to become a victim of its own success, in that we've been dependent on

+00:46:27.280 --> 00:46:32.740
+companies for a number of years to support developers and development. And we've been

+00:46:32.700 --> 00:46:38.080
+assuming that the PSF, which gives money out to lots of organizations to run conferences and

+00:46:38.290 --> 00:46:43.380
+workshops and so forth, can sort of keep scaling up and that they will have enough funding. And

+00:46:43.520 --> 00:46:48.500
+we've seen a few sort of shocks to that system in the last year. The PSF sort of pared down,

+00:46:48.500 --> 00:46:52.680
+about a year ago, what it would give

+00:46:52.840 --> 00:46:56.620
+money for. And then about five months ago, six months ago, I think it was in July or August,

+00:46:56.800 --> 00:46:59.700
+they said, actually, we're not going to be able to fund anything for about a year now.
+

+00:47:00.000 --> 00:47:17.200
+And then there was the government grant, I think from the NSF, that they turned down. And I'm not disputing the reasons for that at all. Basically, it said, well, we'll give you the money if you don't worry about diversity and inclusion. And given that that's like a core part of what the PSF is supposed to do, they could not do that without shutting the doors, which would be kind of counterproductive.

+00:47:17.540 --> 00:47:26.480
+And so I feel like we're not yet there, but we're sort of approaching what I'm going to term a crisis in funding Python.

+00:47:27.040 --> 00:47:32.020
+The needs of the community keep growing and growing, whether it's workshops, whether it's PyPI, whether it's conferences.

+00:47:32.740 --> 00:47:34.640
+And companies are getting squeezed.

+00:47:35.160 --> 00:47:41.760
+And the number of people, it always shocks me every time there are PSF elections, the incredibly small number of people who vote.

+00:47:42.080 --> 00:47:45.680
+Which means that, let's assume half the people who are members vote, or a third of the people.

+00:47:46.000 --> 00:47:50.580
+Like for the millions and millions of people who program Python out there, an infinitesimally

+00:47:50.650 --> 00:47:53.240
+small proportion of them actually join and help to fund it.

+00:47:53.430 --> 00:47:56.620
+So I'm not quite sure what to do with this other than express concern.

+00:47:57.110 --> 00:48:01.760
+But I feel like we've got to figure out ways to fund Python and the PSF in new ways that

+00:48:01.760 --> 00:48:04.440
+will allow it to grow and scale as needed.

+00:48:04.620 --> 00:48:05.620
+I couldn't agree more.

+00:48:06.220 --> 00:48:11.760
+Obviously, the PSF is close to my heart because I was on the board for, I think, a total of

+00:48:11.920 --> 00:48:15.520
+six or nine years or something over, you know, the last 25.
+

+00:48:15.640 --> 00:48:21.520
+I was also, for six months, I was the interim general manager, because Ewa left and we hadn't

+00:48:21.550 --> 00:48:23.480
+hired Deb yet while I was on the board.

+00:48:23.670 --> 00:48:28.940
+I remember signing the sponsor contracts for the companies that came in wanting to sponsor

+00:48:29.170 --> 00:48:29.320
+Python.

+00:48:29.760 --> 00:48:35.260
+And it is like, it's ridiculous how, and I can say this working for a company that is

+00:48:35.310 --> 00:48:38.360
+one of the biggest sponsors of the PSF and has done so for years.

+00:48:38.920 --> 00:48:45.160
+It's ridiculous how small those sponsorships are, and yet how grateful we were that they

+00:48:45.080 --> 00:48:47.860
+came in, because every single one has such a big impact.

+00:48:48.060 --> 00:48:51.620
+You can do so much good with the money that comes in.

+00:48:52.020 --> 00:48:55.140
+We need more corporate sponsorships, more than we need, like,

+00:48:55.300 --> 00:49:00.580
+I mean, obviously a million people giving us a couple of bucks, giving the PSF, let's

+00:49:00.580 --> 00:49:00.900
+be clear.

+00:49:00.960 --> 00:49:01.780
+I'm not on the board anymore.

+00:49:01.980 --> 00:49:05.160
+Giving the PSF a couple of bucks would be fantastic.

+00:49:05.680 --> 00:49:11.880
+But I think the big corporate players where all the AI money is, for instance,

+00:49:12.060 --> 00:49:17.380
+having done basically no sponsorship of the PSF, is mind-boggling. It is a textbook

+00:49:18.420 --> 00:49:25.220
+tragedy of the commons right there, right? They rely entirely on PyPI, and PyPI is run entirely

+00:49:25.220 --> 00:49:30.820
+with community resources, mostly because of very generous and consistent sponsorship,

+00:49:31.180 --> 00:49:38.860
+basically by Fastly, but also the other sponsors of the PSF.
And yet very large players use those

+00:49:38.880 --> 00:49:45.640
+resources more than anyone else and don't actually contribute. Georgi Ker, she wrote this fantastic

+00:49:45.730 --> 00:49:52.000
+blog post saying pretty much this straight after EuroPython. So EuroPython this year was really big,

+00:49:52.260 --> 00:49:57.240
+actually. And she was wandering around looking at the sponsor booths, and the usual players were there,

+00:49:57.620 --> 00:50:03.980
+but none of these AI companies were there. And the relationship actually between AI, if you want to

+00:50:03.800 --> 00:50:09.320
+call it that, let's call it ML and neural networks, and like some of the really big companies and

+00:50:09.610 --> 00:50:14.700
+Python actually is really complex. Obviously, a lot of these companies, and some of us are here,

+00:50:15.160 --> 00:50:20.920
+employ people to work on Python. Companies like Meta and Google have contributed massively to

+00:50:21.140 --> 00:50:27.140
+frameworks like PyTorch, TensorFlow, Keras. So it's not as simple a picture as saying cough up money

+00:50:27.570 --> 00:50:32.620
+all the time. Like there's a more complex picture here, but definitely there are some notable

+00:50:32.900 --> 00:50:39.200
+absences. And we talked about the volume of money going through. I totally agree with the sentiment.

+00:50:39.460 --> 00:50:45.720
+When the shortfall came and the grants program had to shut down, we were brainstorming at JetBrains,

+00:50:45.820 --> 00:50:51.140
+like maybe we can do some sort of, I don't know, donate some more money and call on other companies

+00:50:51.260 --> 00:50:55.820
+to do it. Or we can call on people in the community. And I was like, I don't want to call

+00:50:55.820 --> 00:50:59.760
+on people in the community to do it because they're probably the same people who are also

+00:50:59.780 --> 00:51:05.620
+donating their time for Python.
Like it's just squeezing people who give so much of themselves

00:51:05.880 --> 00:51:10.700
to this community even more. And it's not sustainable. Like Reuben said, if we keep doing

00:51:10.980 --> 00:51:15.800
this, the whole community is going to collapse. Like I'm sure we've all had our own forms of

00:51:16.030 --> 00:51:19.780
burnout from giving too much. I'm going to pat ourselves on the back here. Everyone on this

00:51:19.940 --> 00:51:25.440
call who works at a company are all sponsors of the PSF. Thank goodness. But there's obviously a

00:51:25.440 --> 00:51:29.100
lot of people not on this call who are not sponsors. And I know personally, I wished every

00:51:29.120 --> 00:51:35.220
company that generated revenue from Python usage donated to the PSF. And I

00:51:35.300 --> 00:51:38.440
think part of the problem is some people think it has to be like a hundred thousand dollars. It does

00:51:38.520 --> 00:51:43.380
not have to be a hundred thousand dollars. Now, if you can afford that amount, please do, or more. There

00:51:43.430 --> 00:51:48.140
are many ways to donate more than the maximum amount for getting on the website. But it's one

00:51:48.140 --> 00:51:51.500
of these funny things where a lot of people are just like, oh, it's not me, right? Like even startups don't.

00:51:51.780 --> 00:51:56.300
Some do, to give those ones credit, but others don't, because, like, oh, we're burning through

00:51:56.320 --> 00:52:01.840
capital. And I was like, yeah, but we're not... we're asking for, like, less than you'd pay a dev, by

00:52:01.840 --> 00:52:07.280
a lot, per year, right? Like the amount we actually ask for to get to the highest tier is still less

00:52:07.340 --> 00:52:12.920
than a common developer in Silicon Valley, if we're going to price point to a geographical

00:52:13.080 --> 00:52:18.520
location we can all kind of comprehend. I'm going to steal Ned Batchelder's
observation here. And yeah, what

00:52:18.520 --> 00:52:25.740
the PSF would be happy with is less than a medium-sized company spends on the tips of expensed meals every

00:52:25.760 --> 00:52:31.340
year. Yeah. Yeah. And it's a long-running problem, right? Like, I mean, I've been on the PSF for a

00:52:31.370 --> 00:52:36.380
long time, too. I've not served as many years as Thomas on the board, but I was like executive

00:52:36.470 --> 00:52:40.780
vice president because we had to have someone with that title at some point. It's always been a

00:52:40.940 --> 00:52:45.520
struggle, right? Like I, and I also want to be clear, I'm totally appreciative of where we have

00:52:45.630 --> 00:52:50.660
gotten to, right? Because for the longest time, I was just dying for paid staff on the core team.

00:52:50.830 --> 00:52:55.120
And now we have three developers in residence. Thank goodness. Still not enough, to be clear.

00:52:55.320 --> 00:52:55.920
I want five.

00:52:56.300 --> 00:52:57.080
And I've always said that,

00:52:57.340 --> 00:52:58.320
but I'll happily take three.

00:52:58.740 --> 00:52:59.500
But it's one of these things

00:52:59.560 --> 00:53:00.480
where it's a constant struggle.

00:53:00.740 --> 00:53:02.360
And it got a little bit better

00:53:02.600 --> 00:53:03.240
before the pandemic

00:53:03.460 --> 00:53:04.480
just because everyone

00:53:04.520 --> 00:53:05.480
was spending on conferences

00:53:05.840 --> 00:53:07.240
and PyCon US is a big driver

00:53:07.320 --> 00:53:08.420
for the Python Software Foundation.

00:53:08.860 --> 00:53:10.300
And I know EuroPython's a driver

00:53:10.340 --> 00:53:11.560
for the EuroPython Society.

00:53:12.140 --> 00:53:12.880
But then COVID hit

00:53:13.340 --> 00:53:14.980
and conferences haven't picked back up.
+ +00:53:15.180 --> 00:53:16.480 +And then there's a whole new cohort + +00:53:16.880 --> 00:53:19.400 +of companies that have come in post-pandemic + +00:53:19.740 --> 00:53:20.720 +that have never had that experience + +00:53:21.020 --> 00:53:22.340 +of going to PyCon and sponsoring PyCon. + +00:53:22.440 --> 00:53:23.220 +And so they don't think about, + +00:53:23.360 --> 00:53:25.280 +I think, sponsoring PyCon + +00:53:25.300 --> 00:53:29.140 +the PSF because that's also a big kind of in your face, you should help sponsor this. + +00:53:29.570 --> 00:53:33.840 +And I think it's led to this kind of lull where offered spending has gone down, new entrants + +00:53:33.890 --> 00:53:37.480 +into the community have not had that experience and thought about it. And it's led to this kind + +00:53:37.480 --> 00:53:43.600 +of dearth where, yeah, that PSF had to stop giving out grant money. And it sucks. And I would love + +00:53:43.600 --> 00:53:47.600 +to see it not be that problem. I want to add one interesting data point that I discovered in + +00:53:47.690 --> 00:53:53.800 +short. Keep it short. Yes. NumFocus has about twice the budget of the PSF. I was shocked to + +00:53:53.820 --> 00:54:00.360 +discover this. So basically it is possible to get money from companies to sponsor development of + +00:54:00.540 --> 00:54:05.480 +Python related projects. And I don't know what they're doing that we aren't. And I think it's + +00:54:05.540 --> 00:54:10.960 +worth talking and figuring it out. We need a fundraiser and marketer in residence, maybe. Who + +00:54:10.960 --> 00:54:17.920 +knows? Lauren does a great job, to be clear. The PSF has Lauren and Lauren is that. But it's still + +00:54:18.040 --> 00:54:21.980 +hard. We have someone doing it full time at the PSF and it's just hard to get companies to give + +00:54:22.020 --> 00:54:22.660 +cash up cash. + +00:54:23.500 --> 00:54:24.880 +Yeah, and what do we get in return? 
+ +00:54:25.280 --> 00:54:26.080 +Well, we already get that. + +00:54:26.260 --> 00:54:27.380 +So, yeah, I know. + +00:54:27.840 --> 00:54:28.580 +All right, Barry. + +00:54:28.800 --> 00:54:30.220 +To just, you know, shift gears + +00:54:30.600 --> 00:54:31.620 +into a different area, + +00:54:32.020 --> 00:54:34.560 +something that I've been thinking a lot + +00:54:34.720 --> 00:54:36.580 +over this past year on the steering council. + +00:54:36.860 --> 00:54:38.000 +Thomas, I'm sure, is going to be, + +00:54:38.160 --> 00:54:39.200 +you know, very well aware, + +00:54:39.680 --> 00:54:41.620 +having been instrumental + +00:54:42.040 --> 00:54:45.180 +in the lazy imports PEP A10. + +00:54:45.620 --> 00:54:47.540 +We have to sort of rethink + +00:54:48.260 --> 00:54:50.220 +how we evolve Python + +00:54:50.900 --> 00:54:53.420 +and how we pose changes to Python + +00:54:53.960 --> 00:54:56.440 +and how we discuss those changes in the community. + +00:54:56.590 --> 00:54:59.940 +Because I think one of the things that I have heard + +00:55:00.230 --> 00:55:03.780 +over and over and over again is that authoring PEPs + +00:55:04.260 --> 00:55:08.780 +is incredibly difficult and emotionally draining + +00:55:09.500 --> 00:55:10.940 +and it's a time sink. + +00:55:11.620 --> 00:55:15.860 +And leading those discussions on discuss.python.org, + +00:55:15.990 --> 00:55:17.520 +which we typically call DPO, + +00:55:18.420 --> 00:55:21.400 +can be toxic at times and very difficult. + +00:55:21.800 --> 00:55:24.340 +So one of the things that I realized + +00:55:24.650 --> 00:55:25.960 +as I was thinking about this + +00:55:25.960 --> 00:55:28.760 +is that peps are 25 years old now, right? 
+ +00:55:29.240 --> 00:55:30.820 +So we've had this, + +00:55:31.320 --> 00:55:32.980 +and not only just peps are old, + +00:55:33.110 --> 00:55:35.700 +but like we've gone through at least two, + +00:55:35.890 --> 00:55:38.520 +if not more sort of complete revolutions + +00:55:38.590 --> 00:55:40.040 +in the way we discuss things. + +00:55:40.170 --> 00:55:42.900 +You know, the community has grown incredibly. + +00:55:43.290 --> 00:55:45.720 +The developer community is somewhat larger, + +00:55:46.180 --> 00:55:47.660 +but just the number of people + +00:55:47.680 --> 00:55:54.380 +who are using Python and who have an interest in it has grown exponentially. So it has become + +00:55:54.900 --> 00:56:01.400 +really difficult to evolve the language in the standard library and the interpreter. And we need + +00:56:01.400 --> 00:56:08.760 +to sort of think about how we can make this easier for people and not lose the voice of the user. + +00:56:09.160 --> 00:56:14.500 +And the number of people who actually engage in topics on DPO is the tip of the iceberg. You know, + +00:56:14.600 --> 00:56:24.740 +We've got millions and millions of users out there in the world who, for example, lazy imports will affect, free threading will affect and don't even know that they have a voice. + +00:56:24.890 --> 00:56:31.200 +And maybe we have to basically represent that, but we have to do it in a much more collaborative and positive way. + +00:56:31.740 --> 00:56:33.400 +That's something that I've been thinking about a lot. + +00:56:33.500 --> 00:56:43.580 +And whether or not I'm on the steering council next year, I think this is something that I'm going to spend some time on trying to think about, you know, talk to people about ways we can make this easier for everyone. + +00:56:43.820 --> 00:56:47.340 +The diversity of use cases for Python in the last couple of years. + +00:56:47.390 --> 00:56:48.000 +So complex. + +00:56:48.420 --> 00:56:48.980 +Yes, exactly. 
+ +00:56:49.210 --> 00:56:53.060 +It should also be preface that Barry created the PEP process. He should have started that one. + +00:56:55.780 --> 00:56:57.280 +It is that old. + +00:56:57.610 --> 00:56:57.720 +Yeah. + +00:56:58.540 --> 00:57:02.120 +By the way, just so everyone knows, these are not ages jokes to be mean to Barry. + +00:57:02.340 --> 00:57:07.080 +We've always known Barry long enough that we know Barry's okay with us making these jokes. + +00:57:07.110 --> 00:57:07.900 +To be very, very clear. + +00:57:07.900 --> 00:57:11.920 +Also, I am almost as old as Barry, although I don't look as old as Barry. + +00:57:12.780 --> 00:57:14.300 +Yeah, we're all over from the same age anyways. + +00:57:15.300 --> 00:57:18.600 +Yeah, Barry and I have known each other for 25 years, + +00:57:18.960 --> 00:57:21.760 +and I've always made these jokes of him. + +00:57:21.940 --> 00:57:26.220 +So it is different when you know each other in person. + +00:57:26.420 --> 00:57:27.000 +Let's put it that way. + +00:57:28.840 --> 00:57:30.900 +For the PEP process, I think for a lot of people, + +00:57:31.160 --> 00:57:35.080 +it's not obvious how difficult the process is. + +00:57:35.260 --> 00:57:36.760 +I mean, it wasn't even obvious to me. + +00:57:37.260 --> 00:57:40.720 +I saw people avoiding writing peps multiple times, + +00:57:40.820 --> 00:57:42.940 +and I was upset, like on the steering council, right? + +00:57:43.380 --> 00:57:45.280 +I saw people making changes where I thought, + +00:57:45.440 --> 00:57:46.820 +this is definitely something + +00:57:47.020 --> 00:57:48.420 +that should have been discussed in a PEP + +00:57:48.700 --> 00:57:51.220 +and the discussion should be recorded in a PEP and all that. + +00:57:51.540 --> 00:57:54.600 +And I didn't understand why they didn't until, + +00:57:55.260 --> 00:57:56.800 +basically until PEP 8.10. 
+ +00:57:56.980 --> 00:57:58.920 +So I did PEP 779, + +00:57:58.990 --> 00:58:02.400 +which was the giving free threading supported status + +00:58:02.820 --> 00:58:03.660 +at the start of the year. + +00:58:03.980 --> 00:58:06.200 +And the discussion there was, you know, + +00:58:06.560 --> 00:58:08.540 +sort of as expected and it's already, + +00:58:08.760 --> 00:58:10.280 +was already an accepted PEP. + +00:58:10.600 --> 00:58:12.560 +It was just the question of how does it become supported? + +00:58:12.990 --> 00:58:14.240 +That one wasn't too exhausting. + +00:58:14.770 --> 00:58:20.040 +And then we got to Lazy Imports, which was Pablo, who is another steering council member, + +00:58:20.460 --> 00:58:24.720 +as well as a bunch of other contributors, including me and two of my co-workers and + +00:58:24.820 --> 00:58:28.900 +one of my former co-workers, who had all had a lot of experience with Lazy Imports, but + +00:58:29.040 --> 00:58:31.540 +not necessarily as much experience with the PEP process. + +00:58:32.140 --> 00:58:37.680 +And Pablo took the front seat because he knew the PEP process and he's done like five PEPs + +00:58:37.680 --> 00:58:40.000 +in the last year or something, some ridiculous number. + +00:58:40.360 --> 00:58:48.880 +And he shared with us the vitriol he got for like offline for the, just the audacity of proposing + +00:58:49.120 --> 00:58:54.240 +something that people disagreed with or something. And that was like, this is a technical suggestion. + +00:58:54.300 --> 00:58:59.740 +This is not a code of conduct issue where I have received my fair share of vitriol around. + +00:59:00.160 --> 00:59:06.200 +This is a technical discussion. And yet he gets this, these ridiculous accusations in his mailbox. + +00:59:06.740 --> 00:59:11.980 +And for some reason, only the primary author gets it as well, which is just weird to me. + +00:59:12.140 --> 00:59:13.360 +But people are lazy. 
+ +00:59:13.780 --> 00:59:15.420 +Thomas is what I think you just said. + +00:59:15.640 --> 00:59:24.500 +Remember, the steering council exists because Guido was the got the brunt of this for Pet 572, which was the walrus operator. + +00:59:24.680 --> 00:59:30.660 +Right. Which is just like this minor little syntactic thing that is kind of cool when you need it. + +00:59:31.020 --> 00:59:42.340 +But like just the amount of anger and negative energy and vitriol that he got over that was enough to for him to just say, I'm out, you know, and you guys figure it out. + +00:59:42.400 --> 00:59:48.280 +And that cannot be an acceptable way to discuss the evolution of the language. + +00:59:48.420 --> 00:59:55.100 +Especially since apparently now every single PEP author of any contentious or semi contentious pep. + +00:59:55.420 --> 00:59:58.820 +Although I have to say, Pep 810 had such broad support. + +00:59:59.030 --> 01:00:00.940 +It was hard to call it contentious. + +01:00:01.060 --> 01:00:04.240 +It's just there's a couple of very loud opinions, I guess. + +01:00:04.600 --> 01:00:06.720 +And I'm not saying we shouldn't listen to people. + +01:00:06.770 --> 01:00:10.980 +We should definitely listen to especially contrary opinions. + +01:00:11.130 --> 01:00:12.460 +But there has to be a limit. + +01:00:12.750 --> 01:00:15.880 +There has to be an acceptable way of bringing things up. + +01:00:15.890 --> 01:00:20.760 +There has to be an acceptable way of saying, hey, you didn't actually read the Pep. + +01:00:21.120 --> 01:00:26.720 +please go back and reconsider everything you said after you fully digested the things, + +01:00:27.180 --> 01:00:29.020 +because everything's already been addressed in the pep. + +01:00:29.400 --> 01:00:37.480 +It's just really hard to do this in a way that doesn't destroy the relationship with the person you're telling this, right? 
+ +01:00:37.980 --> 01:00:44.020 +It's hard to tell people, hey, I'm not going to listen to you because you haven't, you know, you've done a bad job. + +01:00:44.520 --> 01:00:45.860 +You've chosen not to inform yourself. + +01:00:46.220 --> 01:00:54.860 +I think you make another really strong point, Thomas, which is that there have been changes that have been made to Python that really should have been a pep. + +01:00:55.220 --> 01:01:01.480 +And they aren't because people don't want to go through core developers, don't want to go through this gauntlet. + +01:01:01.570 --> 01:01:03.700 +And so they'll create a PR and then that. + +01:01:03.700 --> 01:01:06.880 +But that's also not good because then, you know, we don't have that. + +01:01:06.970 --> 01:01:10.520 +We don't have the right level of consideration. + +01:01:11.160 --> 01:01:19.880 +And you think about the way that, you know, if you're in your job and you're making a change to something in your job, you have a very close relationship to your teammates. + +01:01:20.220 --> 01:01:26.000 +And so you have that kind of respect and hopefully, right, like compassion and consideration. + +01:01:26.720 --> 01:01:35.600 +And you can have a very productive discussion about a thing and you may win some arguments and you may lose some arguments, but the team moves forward as one. + +01:01:35.840 --> 01:01:38.500 +And I think we've lost a bit of that in Python. + +01:01:38.920 --> 01:01:39.940 +So that's not great. + +01:01:40.000 --> 01:01:50.220 +I think society in general could use a little more civility and kindness, especially to strangers that they haven't met in forums, social media, driving, you name it. + +01:01:50.760 --> 01:01:54.160 +Okay, but we're not going to solve that here, I'm sure. + +01:01:54.420 --> 01:01:56.920 +So instead, let's do Gregory's topic. 
+ +01:01:57.260 --> 01:02:04.160 +Hey, I'm going to change topics quite a bit, but I wanted to call 2025 the year of type checking and language server protocols. + +01:02:04.800 --> 01:02:13.780 +So many of us probably have used tools like mypy to check to see if the types line up in our code or whether or not we happen to be overriding functions correctly. + +01:02:14.170 --> 01:02:19.980 +And so I've used mypy for many years and loved the tool and had a great opportunity to chat with the creator of it. + +01:02:20.260 --> 01:02:23.800 +And I integrate that into my CI and it's really been wonderful. + +01:02:24.180 --> 01:02:28.600 +And I've also been using a lot of LSPs, like, for example, PyRite or PyLands. + +01:02:28.900 --> 01:02:33.860 +But in this year, one of the things that we've seen is, number one, Pyrefly from the team at Meta. + +01:02:34.160 --> 01:02:36.600 +We've also seen ty from the team at Astral. + +01:02:36.930 --> 01:02:38.600 +And there's another one called Zubon. + +01:02:38.910 --> 01:02:40.760 +And Zubon is from David Halter. + +01:02:41.120 --> 01:02:43.480 +David was also the person who created JEDI, + +01:02:43.770 --> 01:02:47.620 +which is another system in Python that helped with a lot of LSP tasks. + +01:02:48.040 --> 01:02:51.440 +What's interesting about all three of the tools that I just mentioned + +01:02:51.800 --> 01:02:53.320 +is that they're implemented in Rust, + +01:02:53.800 --> 01:02:58.080 +and they have taken a lot of the opportunity to make the type checker + +01:02:58.260 --> 01:03:01.040 +and or the LSP significantly faster. + +01:03:01.480 --> 01:03:07.640 +So for me, this has changed how I use the LSP or the type checker and how frequently I use it. + +01:03:07.880 --> 01:03:16.580 +And in my experience, it has helped me to take things that might take tens of seconds or hundreds of seconds and cut them down often to less than a second. 
+ +01:03:17.080 --> 01:03:23.860 +And it's really changed the way in which I'm using a lot of the tools like ty or Pyrefly or Zubon. + +01:03:24.180 --> 01:03:31.800 +So I can have some more details if I'm allowed to share, Michael, but I would say 2025 is the year of type checkers and LSPs. + +01:03:31.980 --> 01:03:34.320 +I think given the timing, let's have people give some feedback. + +01:03:34.540 --> 01:03:38.680 +I personally have been using Pyrefly a ton and am a big fan of it. + +01:03:38.740 --> 01:03:42.600 +I don't know if I'm allowed to have an opinion that isn't Pyrefly is awesome. + +01:03:43.380 --> 01:03:48.500 +I mean, I'm not on the Pyrefly team, but I do regularly chat with people from the Pyrefly team. + +01:03:49.040 --> 01:03:50.840 +Tell people real quick what it is, Thomas. + +01:03:51.180 --> 01:03:55.740 +So Pyrefly is Meta's attempt at a Rust-based type checker. + +01:03:56.140 --> 01:03:57.920 +And so it's very similar to ty. + +01:03:58.360 --> 01:04:01.240 +Started basically at the same time, a little later. + +01:04:01.920 --> 01:04:06.100 +Meta originally had a type checker called Pyre, which was written in OCaml. + +01:04:06.480 --> 01:04:09.020 +They basically decided to start a rewrite in Rust. + +01:04:09.300 --> 01:04:11.220 +And then that really took off. + +01:04:11.380 --> 01:04:13.660 +And that's where we're going now. + +01:04:13.900 --> 01:04:14.000 +Yeah. + +01:04:14.140 --> 01:04:14.220 +Yeah. + +01:04:14.230 --> 01:04:17.960 +I don't know what I can say because I'm actually on the same team as the Pylands team. + +01:04:18.460 --> 01:04:20.780 +So, but no, I mean, I think it's good. 
+ +01:04:21.060 --> 01:04:22.920 +I think this is one of those interesting scenarios + +01:04:23.660 --> 01:04:25.180 +where some people realize like, + +01:04:25.340 --> 01:04:27.260 +you know what, we're going to pay the penalty + +01:04:28.059 --> 01:04:30.580 +of writing a tool in a way that's faster, + +01:04:30.740 --> 01:04:31.640 +but makes us go slower + +01:04:31.880 --> 01:04:33.220 +because the overall win for the community + +01:04:33.420 --> 01:04:34.840 +is going to be a good win. + +01:04:34.900 --> 01:04:36.460 +So it's worth that headache, right? + +01:04:36.780 --> 01:04:38.240 +Not to say I don't want to scare people off + +01:04:38.360 --> 01:04:39.640 +from writing Rust, but let's be honest, + +01:04:39.700 --> 01:04:41.040 +it takes more work to write Rust code + +01:04:41.140 --> 01:04:42.360 +than it does take to write Python code. + +01:04:42.940 --> 01:04:44.700 +But some people chose to make that trade off + +01:04:44.740 --> 01:04:45.960 +and we're all benefiting from it. + +01:04:46.260 --> 01:04:47.180 +The one thing I will say + +01:04:47.280 --> 01:04:48.280 +that's kind of interesting from this + +01:04:48.620 --> 01:04:50.060 +that hasn't gotten a lot of play yet + +01:04:50.120 --> 01:04:50.940 +because it's still being developed, + +01:04:51.060 --> 01:05:05.580 +But PyLens is actually working with the Pyrefly team to define a type server protocol, TSP, so that a lot of these type servers can just kind of feed the type information to a higher level LSP and let that LSP handle the stuff like symbol renaming and all that stuff. + +01:05:05.620 --> 01:05:12.300 +Right. Because the key thing here and the reason there's so many different type checkers is there are there is a spec. + +01:05:12.460 --> 01:05:16.500 +Right. And everyone's trying to implement it. But there's differences like in type in terms of type inferencing. 
+ +01:05:16.800 --> 01:05:19.100 +And if I actually go listen to Michael's interview, + +01:05:19.780 --> 01:05:21.460 +talk Python to me with the Pyrefly team, + +01:05:21.560 --> 01:05:23.080 +they actually did a nice little explanation + +01:05:23.140 --> 01:05:24.760 +of the difference between Pyrites approach + +01:05:25.600 --> 01:05:26.860 +and Pyrefly's approach. + +01:05:27.420 --> 01:05:28.360 +And so there's a bit of variance. + +01:05:28.640 --> 01:05:31.140 +But for instance, I think there's some talk now + +01:05:31.360 --> 01:05:32.440 +of trying to like, how do we make it + +01:05:32.440 --> 01:05:33.520 +so everyone doesn't have to reimplement + +01:05:33.640 --> 01:05:34.980 +how to rename a symbol, right? + +01:05:35.100 --> 01:05:35.680 +That's kind of boring. + +01:05:35.840 --> 01:05:37.300 +That's not where the interesting work is. + +01:05:37.640 --> 01:05:39.940 +And that's not performant from perspective of + +01:05:40.180 --> 01:05:42.920 +you want instantaneously to get that squiggly red line + +01:05:43.500 --> 01:05:45.039 +in whether it's VS Code + +01:05:45.040 --> 01:05:48.220 +or it's in PyCharm or whatever your editor is, right? + +01:05:48.380 --> 01:05:50.340 +You want to get it as fast as possible, + +01:05:50.500 --> 01:05:51.340 +but the rename- + +01:05:51.660 --> 01:05:51.860 +Jupyter. + +01:05:52.160 --> 01:05:52.280 +Jupyter. + +01:05:52.400 --> 01:05:53.520 +No, not Emacs. + +01:05:53.520 --> 01:05:53.860 +Everything but Emacs. + +01:05:53.900 --> 01:05:56.020 +No, not Emacs. + +01:05:56.020 --> 01:05:57.640 +Just to bring things full circle, + +01:05:57.640 --> 01:06:00.500 +it's that focus on user experience, right? 
+ +01:06:00.680 --> 01:06:03.020 +Which is, yes, you want that squiggly line, + +01:06:03.020 --> 01:06:04.740 +but when things go wrong, + +01:06:04.740 --> 01:06:05.640 +when your type checker says, + +01:06:05.640 --> 01:06:07.780 +oh, you've got a problem, + +01:06:07.780 --> 01:06:11.820 +you know, like I think about as an analogy, + +01:06:11.820 --> 01:06:14.600 +how Pablo has done an amazing amount of work + +01:06:14.860 --> 01:06:14.920 +on the error reporting, right? + +01:06:15.020 --> 01:06:26.460 +When you get an exception and, you know, now you have a lot more clues about what is it that I actually have to change to make the tool, you know, to fix the problem, right? + +01:06:26.780 --> 01:06:39.740 +Like so many times years ago, you know, when people were using mypy, for example, and they'd have some complex failure of their type annotations and have absolutely no idea what to do about it. + +01:06:40.140 --> 01:06:44.480 +And so getting to a place where now we're not just telling people you've done it wrong, + +01:06:44.960 --> 01:06:48.560 +but also here's some ideas about how to fix it. + +01:06:49.540 --> 01:06:54.340 +I think this is a full circle here because honestly, using typing in your Python code + +01:06:54.600 --> 01:06:57.020 +gives a lot of context to the AI when you ask for help. + +01:06:57.200 --> 01:06:59.320 +If you just give it a fragment and it can't work with it. + +01:06:59.480 --> 01:07:00.080 +That's true. + +01:07:00.360 --> 01:07:06.320 +And also, if you can teach your AI agent to use the type checkers and use the LSPs, + +01:07:06.640 --> 01:07:08.660 +it will also generate better code for you. + +01:07:09.000 --> 01:07:24.320 +I think the one challenge I would add to what Barry said a moment ago is that if you're a developer and you're using, say, three or four type checkers at the same time, you also have to be careful about the fact that some of them won't flag an error that the other one will flag. 
+ +01:07:24.680 --> 01:07:36.300 +So I've recently written Python programs and even built a tool with one of my students named Benedek that will automatically generate Python programs that will cause type checkers to disagree with each other. + +01:07:39.320 --> 01:07:44.580 +I will flag it as an error, but none of the other tools will flag it as an error. + +01:07:45.020 --> 01:07:50.020 +And there are also cases where the new tools will all agree with each other, but disagree with mypy. + +01:07:50.360 --> 01:07:53.100 +So there is a type checker conformance test suite. + +01:07:53.280 --> 01:07:57.280 +But I think as developers, even though it might be the year of LSP and type checker, + +01:07:57.300 --> 01:08:02.980 +we also have to be aware of the fact that these tools are maturing and there's still disagreement among them. + +01:08:03.400 --> 01:08:07.700 +and also just different philosophies when it comes to how to type check and how to infer. + +01:08:08.240 --> 01:08:12.320 +And so we have to think about all of those things as these tools mature and become part of our ecosystem. + +01:08:12.480 --> 01:08:14.040 +Yeah, Greg, that last point is important. + +01:08:14.240 --> 01:08:17.940 +Out of curiosity, how did the things where the type checkers disagree + +01:08:18.560 --> 01:08:21.380 +match up with the actual runtime behavior of Python? + +01:08:21.980 --> 01:08:24.480 +Was it like false positives or false negatives? + +01:08:24.880 --> 01:08:25.759 +That's a really good question. + +01:08:26.000 --> 01:08:29.900 +I'll give you more details in the show notes because we actually have it in a GitHub repository + +01:08:30.220 --> 01:08:31.339 +and I can share it with people. + +01:08:31.740 --> 01:08:43.200 +But I think some of it might simply be related to cases where mypy is more conformant to the spec, but the other new tools are not as conformant. 
+ +01:08:43.370 --> 01:08:48.180 +So you can import overload from typing and then have a very overloaded function. + +01:08:48.720 --> 01:08:59.960 +And mypy will actually flag the fact that it's an overloaded function with multiple signatures, whereas PyRite and Pyrefly and Zubon will not actually flag that, even though they should. + +01:09:00.160 --> 01:09:02.819 +Another big area is optional versus not optional. + +01:09:03.380 --> 01:09:03.480 +Yes. + +01:09:03.700 --> 01:09:08.960 +Like, are you allowed to pass a thing that is an optional string when the thing accepts a string? + +01:09:09.180 --> 01:09:10.380 +Some stuff's like, yeah, it's probably fine. + +01:09:10.420 --> 01:09:11.359 +Others are like, no, no, no. + +01:09:11.400 --> 01:09:13.540 +This is an error that you have to do a check. + +01:09:13.720 --> 01:09:23.259 +And if you want to switch type checkers, you might end up with a thousand warnings that you didn't previously had because of an intentional difference of opinion on how strict to be, I think. + +01:09:23.400 --> 01:09:23.560 +Yeah. + +01:09:23.680 --> 01:09:29.220 +So you have to think about false positives and false negatives when you're willing to break the build because of a type error. + +01:09:29.299 --> 01:09:31.720 +All of those things are things you have to factor in. + +01:09:31.990 --> 01:09:37.040 +But to go quickly to this connection to AI, I know it's only recently, but the Pyrefly + +01:09:37.130 --> 01:09:41.859 +team actually announced that they're making Pyrefly work directly with Pydantic AI. + +01:09:42.400 --> 01:09:46.640 +So there's going to be an interoperability between those tools so that when you're building + +01:09:46.710 --> 01:09:52.160 +an AI agent using Pydantic AI, you can also then have better guarantees when you're using + +01:09:52.339 --> 01:09:53.580 +Pyrefly as your type checker. 
+ +01:09:53.680 --> 01:09:57.659 +It makes total sense, though, because then the reasoning LLM that's at the core of the + +01:09:57.560 --> 01:10:04.220 +agent can actually have that information before it tries to execute the code and you don't get in that + +01:10:04.220 --> 01:10:09.840 +loop that they often get in. You can correct it before it runs. Yeah, really good point. I want to + +01:10:09.840 --> 01:10:14.660 +just sort of express my appreciation to all the people working on this typing stuff. As someone + +01:10:14.820 --> 01:10:27.520 +who's come from many, many years in dynamic languages, I was always like, oh, typing. Those + +01:10:27.540 --> 01:10:32.740 +E, I love seeing how easy it is for people to ease into it when they're in Python. + +01:10:32.980 --> 01:10:33.900 +It's not all or nothing. + +01:10:34.460 --> 01:10:36.660 +C, I love the huge number of tools. + +01:10:36.760 --> 01:10:39.160 +The competition in this space is really exciting. + +01:10:39.640 --> 01:10:40.620 +And D, guess what? + +01:10:40.840 --> 01:10:41.960 +It really, really does help. + +01:10:42.240 --> 01:10:46.760 +And I'll even add an E, which is my students who come from Java, C++, C#, and so forth + +01:10:47.140 --> 01:10:47.920 +feel relief. + +01:10:48.400 --> 01:10:53.080 +They find that without type checking, it's like doing a trapeze act without a safety + +01:10:53.220 --> 01:10:53.400 +net. + +01:10:53.760 --> 01:10:57.160 +And so they're very happy to have that typing in there, + +01:10:57.380 --> 01:10:57.880 +typings in there. + +01:10:58.200 --> 01:10:59.240 +So kudos to everyone. + +01:10:59.480 --> 01:11:00.620 +All right, folks, we are out of time. + +01:11:00.940 --> 01:11:03.040 +This could literally go for hours longer. + +01:11:04.640 --> 01:11:05.300 +It was a big year. + +01:11:05.760 --> 01:11:06.720 +It was a big year, + +01:11:06.860 --> 01:11:09.940 +but I think we need to just have a final word. 
+ +01:11:10.800 --> 01:11:12.380 +I'll start and we'll just go around. + +01:11:12.720 --> 01:11:14.200 +So my final thought here is, + +01:11:14.660 --> 01:11:16.700 +we've talked about some things that are negatives + +01:11:17.060 --> 01:11:19.100 +or sort of downers or whatever here and there, + +01:11:19.640 --> 01:11:22.860 +but I still think it's an incredibly exciting time + +01:11:22.880 --> 01:11:26.080 +To be a developer, data scientist, there's so much opportunity out there. + +01:11:26.600 --> 01:11:29.480 +There's so many things to learn and take advantage of and stay on top of. + +01:11:29.640 --> 01:11:30.280 +And amazing. + +01:11:30.600 --> 01:11:32.860 +Every day is slightly more amazing than the previous day. + +01:11:33.040 --> 01:11:33.860 +So I love it. + +01:11:34.080 --> 01:11:34.900 +Gregory, let's go to you next. + +01:11:35.000 --> 01:11:35.580 +Let's go around the circle. + +01:11:35.780 --> 01:11:39.400 +Yeah, I wanted to give a shout out to all of the local Python conferences. + +01:11:40.200 --> 01:11:43.660 +I actually, on a regular basis, have attended the PyOhio conference. + +01:11:44.280 --> 01:11:45.320 +And it is incredible. + +01:11:45.620 --> 01:11:48.460 +The organizers do an absolutely amazing job. + +01:11:49.000 --> 01:11:54.120 +And they have it hosted on a campus, oftentimes at Ohio State or Cleveland State University. + +01:11:54.820 --> 01:12:00.740 +And incredibly, PyOhio is a free conference that anyone can attend with no registration fee. + +01:12:01.050 --> 01:12:06.180 +So Michael, on a comment that I think is really positive, wow, I'm so excited about the regional + +01:12:06.430 --> 01:12:08.340 +Python conferences that I've been able to attend. + +01:12:08.500 --> 01:12:08.660 +Thomas. + +01:12:09.020 --> 01:12:10.640 +Wow, I didn't expect this. 
+
+01:12:10.840 --> 01:12:16.520
+So I think I want to give a shout out to new people joining the community and also joining
+
+01:12:16.540 --> 01:12:22.060
+the core developer team as triagers or just drive-by commenters. I know we harped a little bit
+
+01:12:22.160 --> 01:12:27.520
+about people, you know, giving strong opinions and discussions, but I always look to the far future
+
+01:12:27.720 --> 01:12:33.300
+as well as the near future. And we always need new people. We need new ideas. We need new opinions. So
+
+01:12:33.840 --> 01:12:38.960
+yeah, I'm, I'm excited that there's still people joining and signing up, even when it's
+
+01:12:39.240 --> 01:12:44.140
+thankless work. So I guess I want to say thank you to people doing all the thankless work. Jodi.
+
+01:12:44.380 --> 01:12:51.740
+Yeah, I want to say this is actually really only my third year or so really in the Python community.
+
+01:12:52.000 --> 01:12:54.220
+So before that, I was just sort of on the fringes, right?
+
+01:12:54.460 --> 01:12:58.020
+And after I started advocacy, I started going to the conferences and meeting people.
+
+01:12:58.660 --> 01:13:04.680
+And I think I didn't kind of get how special the community was until I watched the Python documentary this year.
+
+01:13:04.920 --> 01:13:11.840
+And I talked to Paul about this, Paul Everitt, afterwards, also made fun of him for his like early 2000s fashion.
+
+01:13:11.940 --> 01:13:18.400
+But I think, yeah, like I'm a relative newcomer to this community and you've all made me feel so
+
+01:13:18.600 --> 01:13:24.000
+welcome. And I guess I want to thank all the incumbents for everything you've done to make
+
+01:13:24.120 --> 01:13:30.100
+this such a special tech community for minorities and everyone, newbies, you know, Python,
+
+01:13:30.610 --> 01:13:32.860
+Python is love. Oh, geez. How am I supposed to follow that?
+
+01:13:37.230 --> 01:13:41.900
+I think one of the interesting things that we're kind of looping on here is
+
+01:13:41.920 --> 01:13:45.340
+I think the language evolution has slowed down, but it's obviously not stopped, right?
+
+01:13:45.520 --> 01:13:48.640
+Like as Thomas pointed out, there's a lot more stuff happening behind the scenes.
+
+01:13:49.580 --> 01:13:54.660
+Lazy imports are coming, and that was a syntactic change, which apparently brings out the mean side of some people.
+
+01:13:55.100 --> 01:13:57.900
+And we've obviously got our challenges and stuff, but things are still going.
+
+01:13:58.180 --> 01:13:59.380
+We're still moving along.
+
+01:13:59.490 --> 01:14:04.900
+We're still trying to be an open, welcoming place for people like Jody and everyone else who's new coming on over
+
+01:14:05.100 --> 01:14:11.080
+and to continue to be a fun place for all of us slightly gray-beard people who have been here for a long time
+
+01:14:11.100 --> 01:14:12.000
+to make us want to stick around.
+
+01:14:12.300 --> 01:14:15.140
+I think it's just more of the same, honestly.
+
+01:14:15.380 --> 01:14:17.820
+It's all of us just continuing to do what we can
+
+01:14:17.840 --> 01:14:20.320
+to help out to keep this community being a great place.
+
+01:14:20.520 --> 01:14:22.480
+And it all just keeps going forward.
+
+01:14:22.760 --> 01:14:23.860
+And I'll just end with,
+
+01:14:23.900 --> 01:14:25.920
+if you work for a company that's not sponsoring the PSF,
+
+01:14:26.000 --> 01:14:26.540
+please do so.
+
+01:14:26.600 --> 01:14:30.680
+It's rare to have, I mean, a programming language
+
+01:14:30.900 --> 01:14:31.840
+or any sort of tool
+
+01:14:32.680 --> 01:14:36.040
+where it is both really, really beneficial to your career
+
+01:14:36.640 --> 01:14:38.860
+and you get to hang out with really special,
+
+01:14:39.320 --> 01:14:40.420
+nice, interesting people.
+ +01:14:40.980 --> 01:14:44.680 +And it's easy to take all that for granted if you've been steeped in the community. + +01:14:45.110 --> 01:14:48.520 +I went to a conference about six months ago, a non-Python conference. + +01:14:49.180 --> 01:14:54.760 +And that was shocking to me to discover that all the speakers were from advertisers and sponsors. + +01:14:55.400 --> 01:14:57.240 +Everything was super commercialized. + +01:14:57.580 --> 01:15:00.440 +People were not interested in just like hanging out and sharing with each other. + +01:15:00.880 --> 01:15:04.980 +And it was a shock to me because I've been to basically only Python conferences for so many years. + +01:15:05.050 --> 01:15:08.300 +I was like, oh, that's not the norm in the industry. + +01:15:08.800 --> 01:15:15.640 +So we've got something really special going that not only is good for the people, but good for everyone's careers and mutually reinforcing and helping each other. + +01:15:16.120 --> 01:15:17.580 +And that's really fantastic. + +01:15:17.610 --> 01:15:18.480 +And we should appreciate that. + +01:15:19.140 --> 01:15:20.280 +Barry, final word. + +01:15:20.520 --> 01:15:25.340 +Thomas stole my thunder just a little bit, but just to tie a couple of these ideas together. + +01:15:25.960 --> 01:15:29.360 +Python, and you know, Brett said this, right? + +01:15:29.520 --> 01:15:33.120 +This is Python is the community or the community is Python. + +01:15:33.580 --> 01:15:38.240 +There's no company that is telling anybody what Python should be. + +01:15:38.600 --> 01:15:39.980 +Python is what we make it. + +01:15:40.460 --> 01:15:43.840 +And, you know, as folks like myself get a little older + +01:15:44.140 --> 01:15:47.500 +and, you know, and we have younger people + +01:15:47.700 --> 01:15:48.620 +coming into the community, + +01:15:48.920 --> 01:15:50.760 +both developers and everything else + +01:15:51.280 --> 01:15:54.320 +who are shaping Python into their vision. 
+
+01:15:54.700 --> 01:15:56.120
+I encourage you,
+
+01:15:56.380 --> 01:15:58.360
+if you've thought about becoming a core dev,
+
+01:15:58.720 --> 01:15:59.440
+find a mentor.
+
+01:15:59.660 --> 01:16:01.320
+There are people out there that will help you.
+
+01:16:01.520 --> 01:16:05.100
+If you want to be involved in the community, the PSF,
+
+01:16:05.460 --> 01:16:06.260
+you know, reach out.
+
+01:16:06.320 --> 01:16:08.560
+There are people who will help guide you
+
+01:16:08.580 --> 01:16:15.680
+into this community. You can be involved. Do not let any self-imposed limitations stop you from
+
+01:16:16.740 --> 01:16:22.440
+becoming part of the Python community in the way that you want to. And eventually run for the
+
+01:16:22.600 --> 01:16:29.320
+steering council because we need many, many, many more candidates next year. And you don't need any
+
+01:16:29.580 --> 01:16:33.940
+qualifications either because I'm a high school dropout and I never went to college or anything.
+
+01:16:34.640 --> 01:16:35.400
+And look at me.
+
+01:16:35.820 --> 01:16:37.600
+And I have a PhD and I will tell you,
+
+01:16:37.610 --> 01:16:39.600
+I did not need all that to become a Python developer
+
+01:16:39.830 --> 01:16:41.980
+because I was the Python developer before I got the PhD.
+
+01:16:42.540 --> 01:16:43.360
+I'm a bass player.
+
+01:16:43.430 --> 01:16:44.960
+So if I can do it, anybody can do it.
+
+01:16:47.920 --> 01:16:49.120
+Thank you everyone for being here.
+
+01:16:49.520 --> 01:16:50.860
+This awesome look back on the year,
+
+01:16:50.940 --> 01:16:52.220
+and I really appreciate you all taking the time.
+
+01:16:52.420 --> 01:16:52.860
+Thank you, Michael.
+
+01:16:53.080 --> 01:16:54.440
+Thanks everybody.
+
+01:16:54.830 --> 01:16:55.140
+Bye everybody.
+
+01:16:57.320 --> 01:16:59.700
+This has been another episode of Talk Python To Me.
+
+01:17:00.080 --> 01:17:00.800
+Thank you to our sponsors.
+
+01:17:01.010 --> 01:17:02.300
+Be sure to check out what they're offering.
+
+01:17:02.500 --> 01:17:03.860
+It really helps support the show.
+
+01:17:04.400 --> 01:17:07.200
+Look into the future and see bugs before they make it to production.
+
+01:17:08.000 --> 01:17:13.280
+Sentry's Seer AI Code Review uses historical error and performance information at Sentry
+
+01:17:13.660 --> 01:17:17.620
+to find and flag bugs in your PRs before you even start to review them.
+
+01:17:18.280 --> 01:17:20.100
+Stop bugs before they enter your code base.
+
+01:17:20.620 --> 01:17:24.500
+Get started at talkpython.fm/seer-code-review.
+
+01:17:25.020 --> 01:17:26.880
+If you or your team needs to learn Python,
+
+01:17:27.080 --> 01:17:30.560
+we have over 270 hours of beginner and advanced courses
+
+01:17:30.560 --> 01:17:37.160
+on topics ranging from complete beginners to async code, Flask, Django, HTMX, and even LLMs.
+
+01:17:37.480 --> 01:17:39.820
+Best of all, there's no subscription in sight.
+
+01:17:40.320 --> 01:17:41.980
+Browse the catalog at talkpython.fm.
+
+01:17:42.680 --> 01:17:46.060
+And if you're not already subscribed to the show on your favorite podcast player,
+
+01:17:46.700 --> 01:17:47.340
+what are you waiting for?
+
+01:17:47.980 --> 01:17:49.780
+Just search for Python in your podcast player.
+
+01:17:49.880 --> 01:17:50.760
+We should be right at the top.
+
+01:17:51.180 --> 01:17:53.940
+If you enjoy that geeky rap song, you can download the full track.
+
+01:17:54.160 --> 01:17:56.060
+The link is actually in your podcast player's show notes.
+
+01:17:56.860 --> 01:17:58.220
+This is your host, Michael Kennedy.
+
+01:17:58.540 --> 01:17:59.680
+Thank you so much for listening.
+
+01:17:59.980 --> 01:18:00.620
+I really appreciate it.
+
+01:18:01.120 --> 01:18:01.800
+I'll see you next time.
+
diff --git a/transcripts/533-web-frameworks-in-prod-by-their-creators.txt b/transcripts/533-web-frameworks-in-prod-by-their-creators.txt
new file mode 100644
index 0000000..c6bdb4e
--- /dev/null
+++ b/transcripts/533-web-frameworks-in-prod-by-their-creators.txt
@@ -0,0 +1,2138 @@
+00:00:00 Today on Talk Python, the creators behind FastAPI, Flask, Django, Quart, and Litestar
+
+00:00:05 get practical about running apps based on their frameworks in production.
+
+00:00:10 Deployment patterns, async gotchas, servers, scaling, and the stuff that you only learn
+
+00:00:15 at 2 a.m. when the pager starts going off.
+
+00:00:17 For Django, we have Carlton Gibson and Jeff Triplett.
+
+00:00:21 For Flask, we have David Lord and Phil Jones.
+
+00:00:23 And on Team Litestar, we have Yannick Noverde and Cody Fincher.
+
+00:00:28 And finally, Sebastian Ramirez from FastAPI is here as well.
+
+00:00:32 Let's jump in.
+
+00:00:33 This is Talk Python To Me, episode 533, recorded December 17th, 2025.
+
+00:00:55 Welcome to Talk Python To Me.
+
+00:00:57 the number one Python podcast for developers and data scientists.
+
+00:01:01 This is your host, Michael Kennedy.
+
+00:01:02 I'm a PSF fellow who's been coding for over 25 years.
+
+00:01:07 Let's connect on social media.
+
+00:01:08 You'll find me and Talk Python on Mastodon, Bluesky, and X.
+
+00:01:11 The social links are all in your show notes.
+
+00:01:14 You can find over 10 years of past episodes at talkpython.fm.
+
+00:01:18 And if you want to be part of the show, you can join our recording live streams.
+
+00:01:21 That's right.
+
+00:01:22 We live stream the raw uncut version of each episode on YouTube.
+
+00:01:26 Just visit talkpython.fm/youtube to see the schedule of upcoming events.
+
+00:01:30 Be sure to subscribe there and press the bell so you'll get notified anytime we're recording.
+
+00:01:35 Hey, before we jump into the interview, I just want to send a little message to all the companies
+
+00:01:39 out there with products and services trying to reach developers.
+
+00:01:44 That is the listeners of this show.
+
+00:01:46 As we're rolling into 2026, I have a bunch of spots open.
+
+00:01:50 So please reach out to me if you're looking to sponsor a podcast or just generally sponsor
+
+00:01:56 things in the community and you haven't necessarily considered podcasts, you really should.
+
+00:02:00 Reach out to me and I'll help you connect with the Talk Python audience.
+
+00:02:05 Thanks everyone for listening all of 2025. And here we go into 2026. Cheers.
+
+00:02:11 Hello, hello, Carlton, Sebastian, David, Cody, Yannick, Phil, Jeff, welcome back to Talk Python,
+
+00:02:19 all of you. Thanks for having us. Thank you for having us. Happy to be here again. We're here for
+
+00:02:22 what may be my favorite topic for sure. Something I spend most of my time on is web API stuff,
+
+00:02:30 which is awesome. So excited to have you here to give your inside look at how people should
+
+00:02:36 be running your framework, at least the one that you significantly contribute to, depending on
+
+00:02:42 which framework we're talking about, right? It's going to be a lot of fun, and I'm really excited
+
+00:02:47 to talk about it. However, there's an interesting fact that I've been throwing out a lot lately is
+
+00:02:51 that fully half of the people doing professional Python development have only been doing it for two
+
+00:02:56 years or less. And some of you have been on the show, it was maybe two years longer than that. Let's just
+
+00:03:01 do a quick round of introductions for people who don't necessarily know you. We'll go around the
+
+00:03:06 squares here in the screen sharing. So Carlton, you're up first. Oh, I get to go first. Brilliant.
+
+00:03:11 Well, I'm Carlton. I work on the Django REST Framework mostly. I'm a former Django fellow.
+
+00:03:16 I maintain a number of packages in the ecosystem. And the last few years I've been back to building
+
+00:03:20 stuff with Django rather than working on it. So I run a build startup that's, well, we're still
+
+00:03:25 going. So I'm quite excited about that. Awesome. How is it to be building with Django rather than building
+
+00:03:30 Django? Oh, I'm literally having the time of my life. Like I spent five years as a Django fellow
+
+00:03:36 working on Django and I just built up this backlog of things I wanted to do and I had no time and no
+
+00:03:42 capacity and no, no sort of nothing to work on with them. And it's just, it's just a delight.
+
+00:03:46 And every day I sit down on my computer thinking, oh, what's today?
+
+00:03:50 I look at the backlog.
+
+00:03:51 Oh, yes.
+
+00:03:52 And every day, a delight.
+
+00:03:54 So I'm still just loving it.
+
+00:03:56 That's awesome.
+
+00:03:57 So more often you're appreciating your former self than cursing your former self
+
+00:04:01 for the way you built.
+
+00:04:04 Yeah, that's an interesting one.
+
+00:04:05 I think we should move on before.
+
+00:04:07 All right.
+
+00:04:08 All right.
+
+00:04:09 Speaking of building with and for Sebastian, FastAPI.
+
+00:04:14 Hello.
+
+00:04:14 Hello.
+
+00:04:15 So, okay, intro for the ones that don't know me.
+
+00:04:18 I'm Sebastian Ramirez.
+
+00:04:19 I created FastAPI.
+
+00:04:21 Yeah, that's pretty much it.
+
+00:04:23 And now I have been building a company since the last two years, FastAPI Cloud, to deploy
+
+00:04:27 FastAPI.
+
+00:04:28 So, I get to drink from funny cups, as you can see.
+
+00:04:33 The world's best boss.
+
+00:04:35 Amazing.
+
+00:04:36 So, I think you deserve to give a bit of a shout out to FastAPI Cloud.
+
+00:04:39 That's a big deal.
+
+00:04:40 Thank you.
+
+00:04:40 Thank you very much.
+
+00:04:41 Yeah, it's super fun.
+
+00:04:42 And the idea is to make it super simple to deploy FastAPI applications.
+
+00:04:47 The idea with FastAPI was to make it very simple to build applications, build APIs,
+
+00:04:52 and get the idea from idea to product in record time.
+
+00:04:57 That was the idea with FastAPI.
+
+00:04:59 But then deploying that, in many cases, is just too cumbersome.
+
+00:05:02 It's too complicated.
+
+00:05:03 There are just so many things to that.
+
+00:05:05 So I wanted to bring something for people to be able to say, like,
+
+00:05:09 hey, just one command FastAPI deploy, and we take care of the rest.
+
+00:05:12 And then we and the team, I have an amazing team that I've been able to work with.
+
+00:05:17 We suffer all the cloud pains so that people don't have to deal with that.
+
+00:05:21 And yeah, it's painful to build, but it's so cool to use it.
+
+00:05:25 You know, like that's the part when I say like, yes, this was worth it.
+
+00:05:29 When I get to use the thing myself, that is super cool.
+
+00:05:32 Yeah, I'm assuming you build FastAPI Cloud with FastAPI somewhat.
+
+00:05:35 Yes, yes, yes, exactly.
+
+00:05:37 FastAPI Cloud runs on FastAPI Cloud.
+
+00:05:40 And I get just like now random things in there and like, yes.
+
+00:05:44 Congrats to that again.
+
+00:05:45 That's super cool.
+
+00:05:46 David Lord, welcome.
+
+00:05:47 Welcome back.
+
+00:05:48 Yeah.
+
+00:05:48 Hello.
+
+00:05:49 I'm David Lord.
+
+00:05:49 I'm the lead maintainer of Pallets, which is Flask, Jinja, Click, Werkzeug,
+
+00:05:55 ItsDangerous, and MarkupSafe.
+
+00:05:56 And now Pallets Eco, which is a bunch of the famous extensions for Flask that are getting
+
+00:06:02 community maintenance now.
+
+00:06:04 I've been doing that since, I think I've been the lead maintainer since like 2019, but a
+
+00:06:08 maintainer since like 2017.
+
+00:06:09 So it's been a while.
+
+00:06:10 That's been a while.
+
+00:06:11 We're coming up on seven, eight years.
+
+00:06:14 That's crazy.
+
+00:06:15 Time flies.
+
+00:06:15 It's always funny because I always feel like I've been doing it for way, way longer.
+
+00:06:18 And then I look at the actual date that I got added as a maintainer.
+
+00:06:21 I'm like, well, it couldn't have been that late.
+
+00:06:22 I was doing stuff before that, right?
+
+00:06:24 Well, I'm sure you were deep in Flask before you got added as a maintainer of it, right?
+
+00:06:28 Yeah.
+
+00:06:28 Phil Jones, since you are also on the same org, next.
+
+00:06:32 Hey, welcome back.
+
+00:06:32 Hello.
+
+00:06:33 Yeah, I'm Phil Jones.
+
+00:06:34 I am the author of Quart, which is also part of Pallets.
+
+00:06:37 I also work on Werkzeug and Flask and help out there.
+
+00:06:42 And I've done a server called Hypercorn as well.
+
+00:06:44 So a bit of interest in that part of the ecosystem.
+
+00:06:47 What is Quart for people who don't know?
+
+00:06:50 Quart is basically Flask with async await.
+
+00:06:53 And that was the idea behind it really to make it possible to do async await.
+
+00:06:57 So yeah, that's pretty much it.
+
+00:06:58 If we, when we manage to merge them, we will.
+
+00:07:00 And the goal now with Quart as part of Pallets is to eventually have it be one code base with Flask.
+
+00:07:07 But given that we both have small children now, we're moving a lot slower.
+
+00:07:13 Having kids is great.
+
+00:07:14 I have three kids.
+
+00:07:15 Productivity is not a thing that they are known to imbue on the parents, right?
+
+00:07:20 Especially in the early days.
+
+00:07:21 I want to say, Phil, thank you.
+
+00:07:23 I've been running Quart for a couple of my websites lately, and it's been amazing.
+
+00:07:26 Nice.
+
+00:07:27 Yeah, I also use it at work.
+
+00:07:29 We've got all our stuff in Quart, which is, yeah, it's really good fun.
+
+00:07:31 A bit like Carlton.
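[Editor's note: the WSGI/ASGI split that Quart and Hypercorn sit on can be sketched with two plain callables, no framework or server required. This is an illustration only, not Quart's code; the handler and driver names below are invented for the example.]

```python
import asyncio

# A minimal WSGI callable: synchronous, one request in, one response out.
def wsgi_app(environ, start_response):
    start_response("200 OK", [("Content-Type", "text/plain")])
    return [b"hello from WSGI"]

# A minimal ASGI callable: a coroutine driven by receive/send events, which
# is what lets a framework like Quart use async/await natively.
async def asgi_app(scope, receive, send):
    assert scope["type"] == "http"
    await send({"type": "http.response.start", "status": 200,
                "headers": [(b"content-type", b"text/plain")]})
    await send({"type": "http.response.body", "body": b"hello from ASGI"})

# Drive both by hand to show the shapes, without any real server.
def call_wsgi():
    captured = {}
    def start_response(status, headers):
        captured["status"] = status
    body = b"".join(wsgi_app({"REQUEST_METHOD": "GET"}, start_response))
    return captured["status"], body

async def call_asgi():
    events = []
    async def receive():
        return {"type": "http.request", "body": b"", "more_body": False}
    async def send(event):
        events.append(event)
    await asgi_app({"type": "http"}, receive, send)
    return events

print(call_wsgi())
print(asyncio.run(call_asgi()))
```

A real server (Gunicorn for the first shape, Hypercorn for the second) plays the role of the hand-written drivers here.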
+
+00:07:32 So when people, if they get, if they listen to the show or they go to the website of the show and they're not on YouTube, then that somehow involves Quart.
+
+00:07:39 Janek, welcome.
+
+00:07:40 Hey.
+
+00:07:41 Yeah, I'm Janek de Vietni.
+
+00:07:42 I work on Litestar.
+
+00:07:44 I just looked up how long it's been because I was curious myself.
+
+00:07:48 I also had the same feeling that it's been a lot longer than, it's actually only been three years.
+
+00:07:53 Yeah.
+
+00:07:53 And I also, I noticed something with all you guys here in the room.
+
+00:07:57 I use almost all of the projects you maintain at work,
+
+00:08:01 which is quite nice.
+
+00:08:04 We have a very big Django deployment.
+
+00:08:06 We have some Flask deployments.
+
+00:08:08 We have a few FastAPI deployments.
+
+00:08:10 I think we have one Quart deployment and we also have two Litestar deployments,
+
+00:08:15 which obviously is a lot of fun to work with.
+
+00:08:17 And I find it really, really nice actually to work with all these different things.
+
+00:08:23 It's super interesting also because like everything has its own niche that it's really good at.
+
+00:08:29 And even, you know, you think if you maintain a framework yourself,
+
+00:08:33 you tend to always recommend it for everything.
+
+00:08:36 But I noticed it's not actually true.
+
+00:08:38 There's actually quite a few cases where I don't recommend Litestar.
+
+00:08:42 I recommend, you know, just, you know, use Django for this or, you know,
+
+00:08:47 use Flask for that or use FastAPI for this because, well, they are quite different after all.
+
+00:08:52 And I find that really, really interesting and nice. And I think it's a good sign of a healthy ecosystem if it's not just, you know,
+
+00:09:00 the same thing, but different, but it actually brings something very unique and different to
+
+00:09:04 the table. I think that's a great attitude. And it's very interesting.
You know, I feel like
+
+00:09:08 there's a lot of people who feel like they've kind of got to pick their tech stack for everything.
+
+00:09:13 I'm going to build a static site. Like, well, I've got to have a Python-based static site builder.
+
+00:09:17 Like, well, it's a static site. Who cares what technology makes it turn? You're writing Markdown,
+
+00:09:21 and out comes HTML.
+
+00:09:22 Who cares what's in the middle, for example, right?
+
+00:09:24 And, you know, I feel like that's kind of a life lesson learned.
+
+00:09:28 Absolutely, yeah.
+
+00:09:29 Yeah, that's awesome.
+
+00:09:30 Cody, hello, hello.
+
+00:09:31 Yeah, hey guys, I'm Cody Fincher.
+
+00:09:32 I'm also one of the maintainers of Litestar.
+
+00:09:34 I've been there just a little bit longer than Yannick.
+
+00:09:37 And so it's been about four years now.
+
+00:09:40 And Yannick actually teed this up perfectly because I was going to say something very similar.
+
+00:09:43 I currently work for Google.
+
+00:09:44 I've been there for about three and a half years now.
+
+00:09:46 And we literally have every one of the frameworks you guys just mentioned,
+
+00:09:50 and they're all in production.
+
+00:09:51 And so one of the things that you'll see on the Litestar org and part of the projects
+
+00:09:56 that we maintain are that we have these optional batteries
+
+00:09:59 and most of the batteries that we have all work with the frameworks for you guys.
+
+00:10:03 And so it's nice to be able to use that stuff, you know, regardless of what tooling you've got
+
+00:10:08 or what project it is.
+
+00:10:10 And so, yeah, having that interoperability and the ability to kind of go between the frameworks
+
+00:10:14 that work the best for the right situation is crucial.
+
+00:10:16 And so I'm glad you mentioned that, Yannick.
+
+00:10:18 But yeah, nice to see you guys on the show. Cody, tell people what Litestar is.
I know I had both you guys and Jacob on a while
+
+00:10:25 ago, but it's been a couple of years, I think. Litestar at its core is really a web framework
+
+00:10:30 that kind of sits somewhere in between, I'd say, Flask and FastAPI and Django. So whereas, you know,
+
+00:10:36 Flask doesn't really, you know, bundle a lot of batteries. There's a huge amount of, you know,
+
+00:10:41 third-party libraries and ecosystem that's built around it that people can add into it, but there's
+
+00:10:44 not really like, for instance, a database adapter or a database plugin or plugins for
+
+00:10:50 Vite or something like that, right, for front end development. And so what we have been doing
+
+00:10:54 is building a API framework that is very similar in concept to FastAPI that is also extensible.
+
+00:10:59 So if you want to use the batteries, they're there for you. But if you don't want to use
+
+00:11:03 them, you don't have to, right? And so a lot of the tooling that we built for Litestar
+
+00:11:07 was birthed out of a startup that I was in prior to joining Google. And so having all
+
+00:11:12 this boilerplate, really, it needed somewhere to go.
+
+00:11:15 And so a lot of this stuff ended up being plugins, which is what we bundled into Litestar
+
+00:11:19 so that you can kind of add in this extra functionality.
+
+00:11:22 And so I know I'm getting long-winded.
+
+00:11:24 It's somewhere between Django and Flask, if you were to think about it in terms of a spectrum,
+
+00:11:29 in terms of what it gives you in terms of a web framework.
+
+00:11:32 But in short, it does everything that all the other guys do.
+
+00:11:35 Very neat. It's definitely a framework I admire.
+
+00:11:37 Jeff Triplett, so glad you could make it.
+
+00:11:39 Yeah, thanks for having me.
+
+00:11:40 Yeah, I'm Jeff Triplett. I'm out of Lawrence, Kansas.
+
+00:11:42 I'm a consultant at a company called Revolution Systems.
+
+00:11:45 I was on, some people know me from being on the Python Software Foundation board.
+
+00:11:48 I've been off that for a few years.
+
+00:11:50 As of last week, I'm the president of the Django Software Foundation.
+
+00:11:53 So I've been on that board for a year.
+
+00:11:54 I'm kind of a Django power user, I guess.
+
+00:11:56 I've used it for about 20 years.
+
+00:11:58 And I've kind of not really worked on, I don't even think I have a patch anymore in Django.
+
+00:12:02 But I've done a lot with the community.
+
+00:12:04 I've done a lot with contributing through conferences and using utilities.
+
+00:12:09 I try to promote Carlton's applications like Neapolitan.
+
+00:12:12 And if I like tools, Python tools in general, I try to advocate for it.
+
+00:12:16 I've also used all of these applications.
+
+00:12:18 Litestar, I haven't, but I have a friend who talks about it a lot.
+
+00:12:21 And so I feel like I know a lot from it.
+
+00:12:23 As a consultant, we tend to go with the best tool for the job.
+
+00:12:25 So I've done a little bit of FastAPI.
+
+00:12:27 I worked with Flask a lot over the years, even though we're primarily a Django shop.
+
+00:12:31 It just depends on what the client needs.
+
+00:12:32 And you see a lot of different sizes of web app deployments.
+
+00:12:36 So I think that's going to be an interesting angle for sure.
+
+00:12:38 Yeah, absolutely.
+
+00:12:39 Small ones to hundreds of servers.
+
+00:12:42 We don't see it as much anymore the last four or five years, especially with like CDNs and caching.
+
+00:12:46 We just don't see load like we did, you know, 10 years ago or so.
+
+00:12:50 And then I also do a lot of like small, I kind of call them some of them little dumb projects, but some are just fun.
+
+00:12:55 Like I've got a FastAPI web ring that I wrote a year ago for April Fool's Day.
+
+00:13:00 And for some reason that kind of took off and people liked it, even though it was kind of a joke.
+
+00:13:03 So I started like peppering it on a bunch of sites and I maintain like Django packages.
+
+00:13:08 I do a newsletter, Django News newsletter, just kind of lots of fun stuff.
+
+00:13:11 Definitely looking forward to hearing all of your opinions.
+
+00:13:14 So I've got a bunch of different your app in production topics
+
+00:13:17 I thought we could just work around or talk over.
+
+00:13:20 So I thought maybe the first one is what would you recommend,
+
+00:13:24 or if you don't really have a strong recommendation, what would you choose for yourself to put your app in your framework in production?
+
+00:13:32 I'm thinking app servers, reverse proxies like Nginx or Caddy.
+
+00:13:36 Do you go for threaded?
+
+00:13:37 Try to scale out with threads.
+
+00:13:39 Or do you try to scale out with processes, Docker, no Docker, Kubernetes. What are we doing here,
+
+00:13:44 folks? Carlton. I think we'll just keep going around the circle here. So you may get the first
+
+00:13:49 round of everyone. No, I'll try to mix it up, but let's do it this time.
+
+00:13:52 I do the oldest school thing in the book. I run Nginx as my front end. I'll stick a
+
+00:14:00 WSGI server behind it with a pre-fork, a few workers, depending on CPU size, depending on
+
+00:14:05 the kind of requests I'm handling. These days, in order to handle long-lived requests,
+
+00:14:10 like server-sent events, that kind of, or WebSocket type things, I'll run an ASGI server as a kind
+
+00:14:14 of sidecar. I've been thinking about this a lot, actually. But yeah, this is interesting.
+
+00:14:18 If you're running a small site and you want long-lived requests, just run ASGI. Just use
+
+00:14:22 ASGI. Because any of the servers, Hypercorn, Uvicorn, Daphne, Granian is the new hot kid on
+
+00:14:29 the block, right? All of those will handle your traffic, no problem.
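[Editor's note: the pre-fork sizing Carlton mentions, "a few workers, depending on CPU size", is often ballparked with the rule of thumb from Gunicorn's documentation: two workers per core, plus one. A tiny sketch; the helper name is invented for illustration.]

```python
import os

def suggested_workers(cpu_count=None):
    """Classic pre-fork sizing rule of thumb (from Gunicorn's docs): 2*CPU + 1.

    The idea is to keep every core busy even while one worker per core is
    blocked on I/O. IO-heavy apps often want threads or async instead, so
    treat this as a starting point, not a law.
    """
    if cpu_count is None:
        cpu_count = os.cpu_count() or 1
    return 2 * cpu_count + 1

print(suggested_workers(4))  # a 4-core box -> 9 workers
```

Which would then feed something like `gunicorn -w 9 myapp:app` behind the Nginx front end.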
But for me, the scaling
+
+00:14:34 patterns in WSGI are so well known and, just like, I can do the maths on the back of a pencil. I know
+
+00:14:39 exactly how to scale it up, having done it for so long. For me, for my core application, I would still
+
+00:14:45 rather use the WSGI server and then limit the async stuff to just the use cases where it's
+
+00:14:51 particularly suited. So I'll do that. Process manager: I deploy using systemd. If I want a
+
+00:14:58 container, I'll use Podman via systemd. It's as old school as it gets. I'll very often run
+
+00:15:03 a Redis instance on localhost for caching, and that will be it.
+
+00:15:08 And that will get me an awful long way.
+
+00:15:09 If I have to scale, I just get a bigger box.
+
+00:15:12 And a bigger box.
+
+00:15:13 Yeah, yeah, yeah.
+
+00:15:13 If I really, really, really need multiple boxes,
+
+00:15:16 well, then we'll talk.
+
+00:15:16 I feel like you and I are in a similar vibe.
+
+00:15:18 But one thing I want to sort of throw out there to you,
+
+00:15:20 but also sort of the others is, what are we talking with databases?
+
+00:15:24 Like, who is bold enough to go SQLite?
+
+00:15:26 Anyone's going SQLite out there?
+
+00:15:28 Yeah, it depends, right?
+
+00:15:30 It just depends on what you're doing, right?
+
+00:15:31 And how many concurrent users you're going to have.
+
+00:15:33 It really is amazing there.
+
+00:15:34 The Pallets website is running on Flask, which I wasn't doing for a while.
+
+00:15:38 I was doing a static site generator.
+
+00:15:39 Then I got inspired by Andrew Godwin's static dynamic sites.
+
+00:15:43 And so it loads up all these markdown files, static markdown files into a SQLite database at runtime
+
+00:15:50 and then serves off of that because you can query really fast.
+
+00:15:53 Oh, that's awesome. I love it.
+
+00:15:54 So I am using SQLite for the Pallets website.
+
+00:15:56 Yeah, I also do have a few small apps that use SQLite.
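[Editor's note: the pattern David describes, loading static markdown into SQLite at startup and serving queries from it, fits in a few lines of stdlib Python. This is a sketch, not the Pallets site's actual code; the pages and schema below are invented for illustration.]

```python
import sqlite3

# Hypothetical pages standing in for the markdown files loaded at startup.
PAGES = {
    "/flask/": "# Flask\nA lightweight WSGI web framework.",
    "/quart/": "# Quart\nFlask's async/await counterpart.",
}

def build_db(pages):
    # Everything lives in memory, rebuilt on process start, so "deploying
    # content" is just restarting the app with new files on disk.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE page (path TEXT PRIMARY KEY, body TEXT)")
    db.executemany("INSERT INTO page VALUES (?, ?)", pages.items())
    return db

db = build_db(PAGES)
row = db.execute("SELECT body FROM page WHERE path = ?", ("/quart/",)).fetchone()
print(row[0].splitlines()[0])  # -> # Quart
```

Each request then becomes a fast indexed lookup instead of re-reading and re-parsing files, while the deployment stays as simple as a static site.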
+
+00:16:00 And one recently that's Cody's fault because he put me on that track
+
+00:16:05 where it's running a SQLite database in the browser because nowadays it's quite easy to do that.
+
+00:16:12 And then you can do all sorts of stuff with it, like hook into it with DuckDB and perform some analysis.
+
+00:16:19 So you don't actually need to run any sort of server at all.
+
+00:16:23 You can just throw some files into Nginx and serve your data.
+
+00:16:26 And as long as that's static, you have a super, super simple deployment.
+
+00:16:30 So yeah, definitely SQLite.
+
+00:16:32 If you can, I like it.
+
+00:16:34 I agree.
+
+00:16:35 It's interesting.
+
+00:16:36 The database probably won't go down with that, probably.
+
+00:16:38 Let's do this by framework.
+
+00:16:40 So we'll do vertical slices in the visual here.
+
+00:16:42 So Jeff.
+
+00:16:43 Yeah, Django, Postgres, pretty old school stack.
+
+00:16:46 I think putting a CDN in front of anything is just a win.
+
+00:16:49 So whether you like Fastly or Cloudflare, you get a lot of mileage out of it.
+
+00:16:52 You learn a lot about caching because it's kind of hard to cache Django by default.
+
+00:16:56 So you get to play with curl and kind of figure out why Vary headers are there.
+
+00:16:59 And it's a good learning experience to get through that.
+
+00:17:02 I also like Coolify, which is kind of new, at least new to me and new to Michael.
+
+00:17:06 We talk about this in our spare time a lot.
+
+00:17:08 It's just kind of a boring service that'll launch a bunch of containers for you.
+
+00:17:12 There's a bunch of one-click installs, so Postgres is a one-click.
+
+00:17:15 It also does backups for you, which is really nice to have for free.
+
+00:17:18 I run a couple dozen sites through it and really like it.
+
+00:17:21 You can either use the hosted form, I don't get any money from it, or you can run the open-source version.
+
+00:17:26 I do both.
+ +00:17:27 I've got like a home lab that I just run stuff using the open-source version. + +00:17:30 And for five bucks a month, it's worth it to run a couple servers. + +00:17:33 And like Carlton said, you can just scale up. + +00:17:35 Yeah, it's got a bunch of one-click deploy for self-hosted SaaS things as well. + +00:17:40 Like I want an analytics stack of containers that run in its own isolated bit. + +00:17:44 Just click here and go. + +00:17:46 Yeah, one-click, it's installed and you're up. + +00:17:48 And then once you get one Django, Flask, FastAPI site working with it, + +00:17:53 and it uses like a Docker container. + +00:17:54 Once you get that set up, it's really easy to just kind of duplicate that site, + +00:17:58 plug it into GitHub or whatever your Git provider is. + +00:18:01 And it's a nice experience for what normally is just rsyncing files, + +00:18:05 and life's too short for that. + +00:18:06 Sebastian, I want to have you go last on this one because I think you've got something pretty interesting + +00:18:12 with FastAPI Cloud to dive into. + +00:18:14 But let's do Litestar next. Cody. + +00:18:16 I have actually bought all the way in on Granian. + +00:18:19 So for the ASGI server, I've actually been running Granian now + +00:18:22 for I'd say a year in production. + +00:18:24 It's worked pretty well. + +00:18:26 There's a couple of new things that I'm actually kind of experimenting + +00:18:28 with. I don't know how well they're going to work out. So I'm going to go ahead and throw this out + +00:18:31 there. But Granian is one of the few ASGI servers that supports HTTP/2. And it actually can do HTTP/2 + +00:18:38 cleartext. And so this is part of the next thing I'm going to say. Because I work for Google, I'm + +00:18:42 actively using lots of Kubernetes and Cloud Run mainly. And so most of the things that I deploy + +00:18:47 are containerized on Cloud Run.
And I typically would suggest if you're not using something like + +00:18:53 systemd and deploying it directly on bare metal, then you are going to want to let the container + +00:18:57 or whatever you're using to manage your processes, manage that and spin that up. + +00:19:01 And so I typically try to allocate, you know, like one CPU for the container + +00:19:05 and let the actual framework scale it up and down as needed. + +00:19:09 Cloud Run itself has a, like an ingress, like a load balancer that sits in front + +00:19:13 that it automatically configures. + +00:19:14 And you're required to basically serve up cleartext traffic when you run Cloud Run. + +00:19:19 And because now Granian supports HTTP/2 and Cloud Run supports HTTP/2 cleartext, + +00:19:25 you can now serve Granian as HTTP/2 traffic. + +00:19:28 The good thing about that is that you get an unlimited upload size. + +00:19:31 And so there are max thresholds to what you can upload into the various cloud environments. + +00:19:35 HTTP/2 usually circumvents that or gets around it because of the way the protocol works. + +00:19:39 And so you get additional features and functionality because of that. + +00:19:42 So anyway, that's what I typically do. + +00:19:44 And most of my databases are usually Postgres, AlloyDB if it needs to be something that's on the analytical side. + +00:19:50 Yeah, I'm on Team Granian as well. + +00:19:52 I think that's a super neat project. + +00:19:53 I had Giovanni on who's behind it a while ago. + +00:19:57 It seems like it's not as popular, but it's based on Hyper from the Rust world, + +00:20:02 which has like 130,000 projects based on it or something. + +00:20:05 So, you know, at its core, it's still pretty battle-tested.
+ +00:20:11 This portion of Talk Python To Me is brought to you by our course, Just Enough Python for Data + +00:20:16 Scientists. If you live in notebooks but need your work to hold up in the real world, check out Just + +00:20:21 Enough Python for Data Scientists. It's a focused, code-first course that tightens the Python you + +00:20:26 actually use and adds the habits that make results repeatable. We refactor messy cells into functions + +00:20:33 and packages, use Git on easy mode, lock environments with uv, and even ship with Docker. + +00:20:39 Keep your notebook speed, add engineering reliability. + +00:20:42 Find it at Talk Python Training. + +00:20:44 Just click courses in the navbar at talkpython.fm. + +00:20:47 Yannick, how about you? + +00:20:49 You've got a variety, it sounds like. + +00:20:50 Yeah, definitely. + +00:20:53 There's a pretty clear split between what I do at work and what I do outside of that. + +00:20:59 So at work, it's Kubernetes deployments. + +00:21:01 And we manage that pretty much the same way that Cody described. + +00:21:05 So it's one or two processes per pod max. + +00:21:10 So you can have Kubernetes scale it, or even manually easily scale that up. + +00:21:14 You can just go into Kubernetes and say, OK, do me one to five more pods or whatever. + +00:21:20 And don't have to worry. + +00:21:21 Don't have to start calculating whatever. + +00:21:23 Most of the stuff we run nowadays with Uvicorn. Our Django deployment, + +00:21:28 up until I think three months ago or so, was running under Gunicorn, + +00:21:34 but we switched that actually. + +00:21:35 And it's been a really great experience. + +00:21:38 I think we tried that a year ago and it didn't work out quite so well. + +00:21:42 There were some things that didn't work as expected or didn't perform great + +00:21:47 or Django was throwing some errors or Uvicorn was throwing some errors.
+ +00:21:52 And then apparently all of that got fixed because now it runs without any issue in production. + +00:21:58 Yeah, for people who don't know, the vibe used to be run Gunicorn, but with Uvicorn workers, if you're doing async stuff. + +00:22:06 And then Uvicorn kind of stepped up its game and said, you can actually treat us as our own app server. + +00:22:12 We'll manage lifecycle and stuff. + +00:22:14 And so that's the path you took, right? + +00:22:16 Yeah, exactly. + +00:22:16 Before that. + +00:22:17 Well, no, actually, before that, we didn't because our Django is fully synchronous. + +00:22:22 It doesn't do any async. + +00:22:24 So it was just bare metal Gunicorn. + +00:22:26 And it's still synchronous with just running it under Uvicorn. + +00:22:30 But interestingly, still quite a bit faster in a few cases. + +00:22:34 We tried that out and we load tested it in a couple of scenarios + +00:22:38 and we found that it makes a lot of sense. + +00:22:41 But outside of that, I do have a lot of, well, very simplistic deployments that are also just systemd + +00:22:48 and a couple of Docker Compose files and containers that are managed through some old coupled together Ansible things. + +00:22:59 But I think the oldest one that I have still running is from 2017. + +00:23:03 And it's been running without a change for like four or five years. + +00:23:07 That is awesome. + +00:23:08 I don't see a reason to do anything about it because the app works. + +00:23:11 It's being used productively. + +00:23:14 So why change anything about that? + +00:23:16 No need to introduce. + +00:23:17 Just don't touch it. + +00:23:18 Yeah, I was actually looking into Coolify that you two guys mentioned. + +00:23:22 I was thinking about, you know, maybe upgrading it to that, but I played around with it and I thought, well, why? + +00:23:28 You know, if I have to look into that deployment maybe once a year.
+ +00:23:31 So there's really nothing to gain for me to make it more complicated. + +00:23:36 David, Team Flask. + +00:23:38 I mentioned this before the show started, but I think I'm pretty sure I've said this the last time I was on Talk Python, + +00:23:45 but the projects I do for work typically have less than 100 users. + +00:23:51 And so my deployment is usually really simple. + +00:23:54 And usually they've chosen like Azure or AWS already. + +00:23:58 So we just have a Docker container and we put it on the relevant Docker container host + +00:24:03 in that service and it just works for them. + +00:24:05 We have a Postgres database and we have like Redis. + +00:24:08 But I never really had to deal with like scaling or that sort of stuff. + +00:24:13 But the funny thing is like, at least for my work, I'm always, we're often replacing older systems. + +00:24:19 And so even a single Docker container running a Flask application is way more performant and responsive than anything they're used to from like some 20 year old or 30 year old Java system. + +00:24:32 Right. And it can just respond on a small container with like a little bit of CPU and a little bit of memory. + +00:24:38 They're always shocked at like, how much do we need to pay for? + +00:24:41 Oh, just like it'll run on a potato. + +00:24:44 You know, there's only 100 users and they're like, that's a lot of users. + +00:24:48 So my recommendation is always start small and then scale up from there. + +00:24:52 Don't try to overthink it ahead of time. + +00:24:55 Yeah, for my personal stuff, I'm using like Docker containers now and fly.io. + +00:24:59 I haven't gotten in. + +00:25:00 So I do want to look into Granian and Coolify, but I haven't gotten there yet. + +00:25:04 And for the Docker container, I can definitely recommend pythonspeed.com. + +00:25:09 I don't remember off the top of my head who writes that, but it's somebody in the Python + +00:25:13 ecosystem.
+ +00:25:14 And they have a whole series of articles on how to optimize your Docker container. + +00:25:18 And that sounds really complicated, but you end up with a Dockerfile that's like 20 lines + +00:25:22 long or something. + +00:25:23 So it's not like there's crazy things. + +00:25:26 It's just you have to know how to structure it. + +00:25:28 And then I just copy and paste that to the next project. + +00:25:30 Nice. + +00:25:30 Yeah. + +00:25:31 I resisted doing Docker for a long time because I'm like, I don't want that extra complexity. + +00:25:34 But then I realized the stuff you put in the Dockerfile is really what you just type in + +00:25:38 the terminal once and then you forget. + +00:25:41 I mean, always using Postgres, Redis, probably if I need some background + +00:25:44 tasks, just plain SMTP server for email. For all three of those things, I wrote new + +00:25:51 extensions in the Flask ecosystem that I'm trying to get more people to know about now. So Flask + +00:25:56 SQLAlchemy Lite, L-I-T-E, instead of Flask SQLAlchemy, takes a much more lightweight approach to + +00:26:02 integrating SQLAlchemy with Flask. And then Flask Redis, I revived from like 10 years of + +00:26:08 non-maintenance. And then I wrote this whole system, this whole pluggable email system called + +00:26:12 email simplified, kind of inspired by Django's pluggable system. And so there's + +00:26:18 like Flask email simplified to integrate that with Flask. But unlike Django, you can use email + +00:26:23 simplified in any library you're writing, in any Python application you're writing. It doesn't have + +00:26:27 to be a Flask web framework. It's pluggable as the library itself. And then you can also integrate + +00:26:32 it with Flask or something else. So Flask email simplified. I get like three downloads a month + +00:26:38 right now. So it needs some popularity. Awesome. I've been doing the non-simplified email lately.
+ +00:26:43 So I'm happy to hear that there might be a better way. Yeah. I think people do underappreciate just + +00:26:48 how much performance you can get out of Python web apps. You know, they're like, oh, we're going to + +00:26:53 need to rewrite this in something else because of the GIL or whatever. Like, I decided just to make + +00:26:59 a point to pull up the tail of my log, running Quart, by the way. And each one of these requests is + +00:27:04 doing like multiple DB calls, and it's like 23 milliseconds, six milliseconds, three milliseconds, + +00:27:10 you know, nine milliseconds. It's like, that's good enough. That's a lot of requests per second + +00:27:16 per worker until you've got to have a lot of traffic. Speaking of Quart, Phil, what's your take + +00:27:21 on this one? I think it's very similar. I also build Docker containers and, uh, with a Postgres database + +00:27:27 on the back end, and I run Hypercorn as the ASGI server and put them behind an AWS load balancer
+ +00:28:17 And it's probably best if you've got users who have not that + +00:28:21 great a network connection. + +00:28:22 But for most other cases, I don't think it matters too much. + +00:28:25 Just keep blasting packets until some of them get through. + +00:28:29 OK, fine. + +00:28:30 We'll give you a page eventually. + +00:28:31 There's three pages, actually. + +00:28:32 All right, Sebastian, you are running not just FastAPI from your experience, but you're running FastAPI for a ton of people through FastAPI Cloud at, I'm sure, many different levels. + +00:28:43 This probably sounds like a shameless plug, and it kind of is, but it's sort of expected. + +00:28:48 I will deploy FastAPI on FastAPI Cloud. + +00:28:51 Just because, well, the idea is just to make it super simple to do that. + +00:28:54 You know, like if you are able to run the command FastAPI run. + +00:28:58 So FastAPI run has like the production server that is using Uvicorn underneath. + +00:29:03 And if you can run that, then you can run also FastAPI deploy. + +00:29:06 And then like, you know, like it will most probably just work. + +00:29:10 And, you know, we just wrap everything and like deploy, + +00:29:13 build, install, deploy, handle HTTPS, all the stuff without needing any Dockerfile or anything like that. + +00:29:19 And I think for many use cases, it's just like simpler being able just to do that. + +00:29:23 There are so many projects that I have been now building, + +00:29:25 like random stuff that is not really important, but now I can. + +00:29:30 And before it was like, yeah, well, I know how to deploy this thing like fully with like + +00:29:34 all the bells and whistles, but it's just so much work that, yeah, maybe later. + +00:29:38 So for that, I would end up just like going with that. + +00:29:41 Now if I didn't... + +00:29:42 Well, what I was going to ask is how much are you willing to tell us how things run inside + +00:29:47 FastAPI Cloud?
+ +00:29:48 Oh, I can't, it's just so much stuff that is going on. + +00:29:52 And it's also, it's funny that nowadays they're like, + +00:29:57 we have Docker and we have Docker Swarm and there was Nomad + +00:30:00 and Kubernetes and oh, Kubernetes won. + +00:30:02 And then we have the cloud providers and there's AWS and Google and Azure. + +00:30:08 And you would expect that all these things and all this complexity, now that it's like, okay, + +00:30:14 these are the clear winners, + +00:30:15 so it's like a lot of complexity to take on, but once you do, it all works. But it doesn't. + +00:30:21 And it's just like so much work to get things to work together, to work correctly. + +00:30:27 And the official resources from the different providers and things, + +00:30:32 in many cases, it's like, oh, the solution is hidden in this issue somewhere in GitHub + +00:30:36 because the previous version was obsolete, but now the new version of this package or whatever is like, it's just, it's crazy. + +00:30:43 But like, yeah, so if I didn't have FastAPI Cloud, I will probably use containers. + +00:30:50 I will probably use Docker. + +00:30:51 If it's like something simple, I will deploy with Docker Compose, + +00:30:55 probably try to scale minimum replicas. + +00:30:57 I don't remember if Docker Compose has that. + +00:30:59 I remember that Docker Swarm had that, but then Docker Swarm sort of lost against Kubernetes. + +00:31:05 I will put a Traefik load balancer in front to handle HTTPS and, yeah, well, like regular load balancing. + +00:31:12 And, yeah, just regular Uvicorn. + +00:31:14 Like some of the folks were talking about before, at some point, we needed to have Gunicorn on top of Uvicorn + +00:31:22 because Uvicorn wouldn't be able to handle workers. + +00:31:25 But now Uvicorn can handle its workers and everything + +00:31:27 and handle the main thing, which was spawning the processes and reaping the processes and handling the stuff.
+ +00:31:34 Now it can just do that. + +00:31:35 So you can just run plain Uvicorn. + +00:31:37 So if you're using FastAPI and you say FastAPI run, that already does that. + +00:31:41 So if you're deploying on your own, you can just use the FastAPI run command. + +00:31:44 Then, of course, you have to deal with the scaling and HTTPS and load balancing + +00:31:48 and all the stuff, but the core server, + +00:31:51 you can just run it directly. + +00:31:53 If going beyond that, then there will probably be + +00:31:56 some Kubernetes cluster and trying to scale things, + +00:32:00 figure out the ways to scale things based on the load of the requests, + +00:32:05 like scaling automatically. + +00:32:08 Having normally one container per process to be able to scale that more dynamically + +00:32:13 without depending on the local memory for each one of the servers + +00:32:15 and things like that, I'm probably saying too much. + +00:32:17 But yeah, actually, you know, like if I didn't have FastAPI Cloud, + +00:32:20 I will probably use one of the providers that abstract those things a little bit away, + +00:32:27 you know, like Render, Railway, Fly, like, I don't know. + +00:32:31 Like, I don't really think that a regular developer should be dealing with, + +00:32:36 you know, like the big hyperscalers and like Kubernetes + +00:32:39 and like all that complexity for a common app. + +00:32:42 Most of the cases, I think it's just really too much complexity to deal with. + +00:32:47 It's kind of eye-watering to open up the AWS console or Azure or something. + +00:32:52 Whoa. + +00:32:52 Oh, the other day, you know, like the other day I had to, in one of the AWS accounts, I had to change the account email. + +00:32:59 I think I spent four hours. + +00:33:01 I know. + +00:33:01 Because I had to create the delegate account that has the right permissions and roles.
+ +00:33:05 And they're like, oh, no, this is, you know, like, sometimes it's just overwhelming the amount of complexity + +00:33:11 that needs to be dealt with. + +00:33:13 And, yeah, I mean, it's great to really have, like, you know, like the infra people that I have working with me + +00:33:20 at the company that can deal with all that mess and, like, can make sure that everything is just running perfectly + +00:33:26 and it just works. + +00:33:27 So it's like, you know, like, sort of SRE as a service, + +00:33:30 DevOps as a service for everyone. + +00:33:32 It's like a cloud product that provides DevOps as a service. + +00:33:36 I spent a number of years doing nothing but cloud migrations to these hyperscalers for enterprises. + +00:33:42 And I can tell you that when you mentioned the eye-watering comment about the network and all that stuff, + +00:33:48 it's so incredibly complicated now, right? + +00:33:49 There's literally every kind of concept that you need to know to deploy these enterprises now, + +00:33:55 move them from on-prem to the cloud. + +00:33:56 So it does get incredibly complicated. + +00:33:58 Having something simple like what Sebastian is talking about, I think, is super helpful + +00:34:02 when you're just trying to get started and get something up and running quickly. + +00:34:06 I've got a lot of questions and I realize that we will not be getting through all of them. + +00:34:10 So I want to pick carefully. + +00:34:12 So let's do this one next. + +00:34:15 Performance, what's your best low-effort tip? + +00:34:18 Not like something super complicated, but I know there's a bunch of low-hanging fruit + +00:34:23 that people maybe missed out on. + +00:34:26 And this time let's start with Litestar. + +00:34:28 Cody, back at you. + +00:34:29 I'm going to stick to what I know, which is databases because I deal with that
There's a couple of things that I see as like gotchas that I constantly see over + +00:34:38 and over. One, SQLAlchemy kind of obfuscates the way it's going to execute things and what kind of + +00:34:45 queries it's going to actually execute. So it's really easy if you're not kind of fluent in how + +00:34:49 it works to create N plus one types of issues. And so when people start talking about sync or async, + +00:34:55 it's really, in my mind, it's less of that because you're going to spend more time waiting on the + +00:34:59 network and database and those kind of things, then you're going to spend serializing just + +00:35:04 generally, right? And or processing things on the web framework. And so, one, making sure that you, + +00:35:10 your relationships dialed in correctly so that you don't have N plus one queries. The other thing is + +00:35:15 oversized connection pooling into Postgres and just databases in general, because what people don't + +00:35:21 tend to know is that each of those connections takes up CPU cycles and RAM of the database. + +00:35:26 And so when you slam the database with hundreds of connections, you're just taking away processing power that can be done for other things, right? + +00:35:33 And so you end up ultimately slowing things down. + +00:35:35 So I've seen databases that have had so many connections that all of the CPU and all the stuff is actually doing things, just managing connections and can't actually do any database work. + +00:35:44 And so what about this socket? + +00:35:45 Is it busy? + +00:35:46 What about this socket? + +00:35:46 Is it busy? + +00:35:47 It's just round robin that, right? + +00:35:48 Paying attention to the database is kind of my first kind of rule of thumb. + +00:35:52 100%. + +00:35:52 I like that one a lot. + +00:35:53 I'll throw in putting stuff or identifying work that doesn't need to be done immediately for the user and putting in a background task. 
+ +00:36:02 Having a background worker defer things till later. + +00:36:05 So sending email is an example, although there's nuances there about knowing that it's sent and everything. + +00:36:10 But yeah, if your user kicks off some process and then you wait to do that process in the worker, you're holding that worker up, which is more relevant in WSGI than ASGI. + +00:36:20 And you're making them wait for their page to load again, versus record what they wanted to do, + +00:36:26 send it off to the background, let them see the status of it, but let the background worker handle + +00:36:30 it. All right. Yeah, like I said, as you guys go for it. I'm not sure if that's some sort of, + +00:36:35 it's not really a trick or a tip, it's more like, I think, the most common mistake I see, and it + +00:36:41 is ASGI specific, but when I look at ASGI apps that people have written, who are maybe not as + +00:36:46 familiar with ASGI or async Python at all: if you make something an async function, you should be + +00:36:53 absolutely sure that it's non-blocking. Because if you're running an ASGI app and you're blocking + +00:36:59 anywhere, your whole application server is blocked completely. It doesn't handle any other requests + +00:37:04 at the same time. It's blocked. I don't think I've had any mistake more times when I've looked through + +00:37:11 some apps that someone has written or that I've come across somewhere. So this is really, it's super, + +00:37:18 super common, and it has such a big impact on the overall performance in every metric + +00:37:26 imaginable. So I would say, unless, and that's nowadays what I tell people, unless you're 100% + +00:37:33 sure that you know what you're doing and you know it's non-blocking, don't make it async. Put it
+ +00:37:42 All of the ASGI frameworks and Django give you a lot of tools at hand to translate your stuff + +00:37:49 from sync to async so you can still run it. + +00:37:52 Do that unless you're very sure that it actually fully supports + +00:37:57 async. + +00:37:57 Yeah, that's good advice. + +00:37:58 Sebastian. + +00:37:59 Hey, I'm actually going to second Yannick. + +00:38:02 I think, yeah, like it's-- + +00:38:04 and it's maybe counterintuitive that one of the tips for performance is to try to not optimize that much for performance at the beginning. + +00:38:13 You know, like, I think the idea with async is like, oh, you can get so much performance + +00:38:17 and throughput in terms of concurrency, whatever. + +00:38:20 But the thing is, in most of the cases, you know, like, until apps grow so large, they + +00:38:26 actually don't need that much extra throughput, that much extra performance. + +00:38:30 And in a framework like, you know, like, as Yannick was saying, well, in my case, I know + +00:38:35 FastAPI, but like, you know, like also many others. + +00:38:38 If you define the function with async, it's going to be run async. + +00:38:41 If you define it non-async, with regular def, it's going to be run on a thread worker automatically. + +00:38:46 So it's just going to do the smart thing automatically. + +00:38:50 So it's like fair, you know, like it's going to be good enough. + +00:38:54 And then you can just start with that and just keep blocking code everywhere. + +00:38:58 You know, like just not use async until you actually know + +00:39:01 that you really need to use async. + +00:39:03 And once you do, you have to be, as Yannick was saying, + +00:39:05 you know, like 100% sure that you are not running blocking code inside of it. + +00:39:11 And if you need to run blocking code inside of async code, + +00:39:14 then make sure that you are sending it to a thread worker.
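The rule Yannick and Sebastian are both describing can be sketched with the stdlib alone: never call blocking code directly inside a coroutine; push it to a thread instead. Here `time.sleep` stands in for any blocking call (a sync database driver, `requests`, file I/O); the function names are just for illustration.

```python
# Blocking inside async code, the wrong way and the right way.
# time.sleep stands in for any blocking call.
import asyncio
import time


def blocking_work(x: int) -> int:
    time.sleep(0.1)  # blocks whatever thread runs it
    return x * 2


async def bad(x: int) -> int:
    # Calling blocking code directly freezes the whole event loop:
    # no other request on this worker makes progress for 0.1s.
    return blocking_work(x)


async def good(x: int) -> int:
    # Run the blocking call in the default thread pool; the event
    # loop stays free to serve other requests meanwhile.
    return await asyncio.to_thread(blocking_work, x)


async def main() -> list[int]:
    # Ten concurrent calls overlap in threads instead of running
    # back to back on the event loop.
    return await asyncio.gather(*(good(i) for i in range(10)))
```

`asyncio.to_thread` is the plain-stdlib spelling; AnyIO's `to_thread.run_sync` and the framework helpers mentioned above do the equivalent job.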
+ +00:39:17 Sending it to a thread worker sounds like its own thing, but yeah, like, you know, like Django has tools, + +00:39:23 AnyIO has tools. + +00:39:23 I also built something on top of AnyIO called Asyncer, + +00:39:27 that is just to simplify these things, to asyncify a blocking function, + +00:39:31 keeping all the type information so that you get autocompletion and inline errors and everything, + +00:39:35 even though it's actually doing all the stuff of sending the thing to the thread worker. + +00:39:40 So the code is super simple. + +00:39:42 You keep very simple code, but then underneath it's just like doing + +00:39:45 all the stuff that should be done. + +00:39:46 But you know, like that's normally when you actually need to hyper-optimize things. + +00:39:51 In most of the cases, you can just start with just not using async at first. + +00:39:55 Also, now that you're going to have multi-threaded Python, + +00:39:58 then suddenly you're going to have just so much more performance out of the blue + +00:40:02 without even having to do much more. + +00:40:05 So, yeah, actually that's, you know, like, sorry, I kept speaking so much, but here's a tip for improving performance. + +00:40:12 Upgrade your Python version. + +00:40:14 I was just chatting today with Savannah. + +00:40:17 She was adding the benchmarks to the, you know, like Python benchmark, the official Python benchmarks that they run for the CPython, the faster CPython program. + +00:40:29 And the change from Python 3.10 to Python 3.14 when running FastAPI is like almost double the performance + +00:40:40 or something like that. + +00:40:40 It was like, it was crazy. + +00:40:42 It was just crazy improvement in performance. + +00:40:44 So you can just upgrade your Python version. + +00:40:46 You're gonna get so much better performance just out of that. + +00:40:50 Yeah, that's an awesome piece of advice that I think is often overlooked.
+ +00:40:53 And it's not only CPU speed, it's also that memory usage gets a lot lower. + +00:40:57 Whoever's gonna jump in, go ahead. + +00:40:58 Last year, I was looking at MarkupSafe, which is an HTML escaping library that we use and + +00:41:04 has a C extension for speedups. + +00:41:05 And I almost convinced myself that I can stop maintaining the C extension because just Python + +00:41:11 itself got way faster. + +00:41:13 But then it turned out that I could do something to the C extension to make it faster also. + +00:41:17 So I'm still maintaining. + +00:41:18 But just the fact that I almost convinced myself like, oh, I can drop a C extension for just + +00:41:23 a Python upgrade instead was pretty impressive. + +00:41:26 They've done a lot, especially with like string handling and, you know, which you're going to use + +00:41:30 for templating for web apps. + +00:41:32 Phil. + +00:41:32 Yeah, well, I definitely echo looking at your DB queries because by and large, that's always where + +00:41:38 our performance issues have been. + +00:41:39 It's either a badly written query or we're returning most of the database when the user just wants to know + +00:41:44 about one thing or something silly like that. + +00:41:46 I was thinking about low-hanging ones, which I think you asked about. + +00:41:48 So I'd say uvloop, which is still a noticeable improvement. + +00:41:53 And also, because I think it's likely a lot of us are returning JSON, often changing the + +00:41:59 JSON serializer to one of the faster ones can be noticeable as well and obviously quite easy to do. + +00:42:04 So yeah, that's my tip. + +00:42:05 That's really good advice. + +00:42:06 I didn't think about the JSON serializer. + +00:42:08 What one do you recommend? + +00:42:09 I think, is it ujson? + +00:42:11 Or is it orjson? + +00:42:12 I can't remember which one was deprecated.
+ +00:42:15 But yeah, if you look at the TechEmpower benchmarks, everyone's changing the JSON serializer + +00:42:21 to get that bit extra speed. + +00:42:22 But yeah, you're like, our framework looks bad because our JSON serializer is like a third + +00:42:27 of the performance. + +00:42:28 We changed, well, David added a JSON provider to Flask. + +00:42:31 And yeah, you could see it make a difference in the TechEmpower benchmarks. + +00:42:35 So that was really good. + +00:42:36 Yeah, cool. + +00:42:36 Yeah, it's pluggable now. + +00:42:37 But if you're installing orjson with Flask, I mean, I don't know what other JSON library + +00:42:43 you'd be using at this point, unless you're already using one. + +00:42:45 But orjson is very, very fast. + +00:42:47 Okay, this is something I'm going to be looking at later. + +00:42:49 So over to Django, Jeff. David talked about running stuff in the background and was it Django 5 or Django 6 that got the background task thing? + +00:42:57 Yeah, Django 6 just came out a couple of weeks ago. + +00:43:00 And I'll hand that off to Carlton in a second because I think Carlton's had more to do with the actual plumbing being on the steering council. + +00:43:07 My advice to people is the best way to scale something is just to not do it, avoid the process completely. + +00:43:12 So like I mentioned with the CDN earlier, for content-heavy sites, cache the crap out of stuff. + +00:43:16 It doesn't even have to hit your servers. + +00:43:17 You can get a lot, as we mentioned earlier, too, just by doubling the amount of resources a project has. + +00:43:22 Django is pretty efficient these days, especially with async views. + +00:43:25 Like everybody else has said, too, any blocking code, move off to threads, move off to a background queue. + +00:43:31 Django Q2 is my favorite one to use because you can use a database. + +00:43:35 So for those little side projects where you just want to run one or two processes, you can use it.
+
+00:43:39 It works great.
+
+00:43:40 And Carlton, if you want to talk about Django internals.
+
+00:43:43 Yeah, OK.
+
+00:43:43 So the new task framework I just mentioned, the main thing, the main sort of bit about it is that it's, again, this pluggable Django API.
+
+00:43:51 So it gives a standard task API.
+
+00:43:53 So if you're writing a third-party library and you, I don't know, you need to send an email.
+
+00:43:57 It's the canonical example, right?
+
+00:43:58 You need to send an email in your third-party library.
+
+00:44:01 Before, you'd have had to tie yourself to a specific queue implementation, whereas now Django is providing a kind of like an ORM of tasks.
+
+00:44:07 Right, right.
+
+00:44:08 You got to do Redis, you got to do Celery, and you got to manage things and all that.
+
+00:44:11 You don't have to pick that now as the third-party package author.
+
+00:44:14 You can just say, right, just use Django, wrap this as a Django task and queue it.
+
+00:44:18 And then the developer, when they come to choose their backend,
+
+00:44:22 if they want to use Celery or they want to use Django Q2
+
+00:44:25 or they want to use the Django task backend, which Jake Howard, who wrote this for Django, provided as well,
+
+00:44:30 you can just plug that in.
+
+00:44:32 So it's a pluggable interface for tasks, which is, I think, the really nice thing about it.
+
+00:44:37 In terms of quick wins, everybody's mentioned almost all of mine.
+
+00:44:40 I'm going to, Cody and Phil, they mentioned the database.
+
+00:44:43 That's the big one.
+
+00:44:44 Django, the ORM, because it does lazy related lookups,
+
+00:44:48 it's very easy to trigger an N+1 where, you know,
+
+00:44:51 the book has multiple authors and suddenly you're iterating through the books
+
+00:44:55 and you're iterating through the authors and it's a lookup.
+
+00:44:57 So you need to do things like prefetch_related, select_related.
+
+00:45:00 You need to just check that you've got those. 
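The "ORM of tasks" idea described above can be sketched as a standard task interface with swappable backends: library code enqueues against one API, and the project picks the backend (Celery, Django Q2, a database-backed queue, ...). All names here are illustrative, a minimal sketch rather than Django's actual tasks API.

```python
from dataclasses import dataclass, field
from typing import Any, Callable

@dataclass
class ImmediateBackend:
    """Toy backend that runs tasks inline; a real backend would enqueue them."""
    log: list = field(default_factory=list)

    def enqueue(self, func: Callable, *args: Any, **kwargs: Any) -> Any:
        self.log.append(func.__name__)   # pretend this is a queue entry
        return func(*args, **kwargs)

backend = ImmediateBackend()  # in Django, this choice would live in settings

def task(func: Callable) -> Callable:
    """Give a plain function an .enqueue() method, like a task object."""
    func.enqueue = lambda *a, **kw: backend.enqueue(func, *a, **kw)
    return func

@task
def send_email(to: str) -> str:  # the canonical example from the discussion
    return f"sent to {to}"

# Third-party code only ever calls .enqueue(); the backend stays swappable.
print(send_email.enqueue("user@example.com"))  # → sent to user@example.com
```

The third-party package author writes `send_email.enqueue(...)` once; whether that runs on Celery, Django Q2, or inline in tests is the application's decision, not the library's.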
+
+00:45:02 Django Debug Toolbar is a great thing to run in development
+
+00:45:05 where you can see the queries and it'll tell you where you've got the duplicates.
+
+00:45:08 And then the slightly bigger one is to just check your indexes.
+
+00:45:11 The ORM will create the right indexes if you're leaning on,
+
+00:45:14 if you're going through primary keys or unique fields.
+
+00:45:16 But sometimes you're doing a filter on some field, and then there's not the right index there,
+
+00:45:21 and that can really slow you down.
+
+00:45:22 So again, you can do the SQL explain on that and find that.
+
+00:45:26 And then the thing I was going to say originally was caching,
+
+00:45:30 is get a Redis instance, stick it next to your Django app,
+
+00:45:34 and as Jeff said, don't do the work.
+
+00:45:36 If you're continually rendering the same page and it never changes,
+
+00:45:40 cache it and pull it from the cache rather than rendering.
+
+00:45:43 Because DB queries are one of your biggest things.
+
+00:45:45 The second one's always going to be serialization.
+
+00:45:47 It's either serialization or template rendering.
+
+00:45:49 So if you can avoid that by caching, you can save an awful lot of time on your account.
+
+00:45:53 Yeah.
+
+00:45:54 I was wondering if somebody would come back with database indexes,
+
+00:45:56 because that's like a 100x multiplier for free almost.
+
+00:46:01 It's such a big deal.
+
+00:46:03 It really can be.
+
+00:46:03 If you're making a particular query and it's doing a full table scan
+
+00:46:06 all of a sudden you put the index in, it's instant. It's like, oh, wow. You don't have to be a DBA or
+
+00:46:12 master information architect sort of thing. I don't know about Postgres. I'm sure it has it.
+
+00:46:16 Somebody can tell me. But with Mongo, you can turn on in the database, I want you to log all
+
+00:46:21 slow queries, and slow for me means 20 milliseconds or whatever. 
Like you put a number in and then
+
+00:46:27 you run your app for a while and you go, look at what's slow by slowest. And then you can see,
+
+00:46:31 well, maybe that needs an index, right? Like just let your app tell you what you got to do.
+
+00:46:35 Yeah, there is a post.
+
+00:46:37 I'm just trying to see if I can quickly look it up now.
+
+00:46:38 There's a Postgres extension, which will automatically run explain
+
+00:46:42 on the slow queries and log them for you.
+
+00:46:44 So it'll...
+
+00:46:45 There you go.
+
+00:46:46 See if I can find...
+
+00:46:47 It's pg_stat_statements, I think, is what you're thinking about.
+
+00:46:49 Right, okay.
+
+00:46:49 If you're unsure about your database indexes, do this, or at least go back and review your queries.
+
+00:46:55 Yeah, I agree.
+
+00:46:56 Very good.
+
+00:46:57 All right, I can see we're blazing through these questions.
+
+00:47:00 I had one more.
+
+00:47:01 If I can mention one.
+
+00:47:01 No, please go ahead.
+
+00:47:02 Yeah, go ahead, David.
+
+00:47:03 If you want to like get some more responsive parts of your website, like make your website a little more responsive or interactive with the user, HTMX or Datastar, especially like if you're using Quart or another ASGI framework where you can do SSE, server-sent events, or WebSockets, streaming little bits of changes to the web front end and then rendering them with the same HTML you're already writing can make things a lot more responsive.
+
+00:47:28 We had a talk about that from Chris May at FlaskCon last year, which you can find on YouTube.
+
+00:47:33 This is not one of the questions, but let me just start out for a quick riff on this, folks.
+
+00:47:38 Out in the audience, someone was asking, what about HTMX?
+
+00:47:41 And I think more broadly, I am actually a huge fan of server-side, template-based apps.
+
+00:47:48 I think it just keeps things simpler in a lot of ways, unless you need a lot of interactivity. 
+
+00:47:51 But things like HTMX or a little bit of JavaScript can reduce a lot of the traffic and stuff.
+
+00:47:57 Where do people land on those kinds of things?
+
+00:47:59 I absolutely love HTMX, not just because you don't have to write a lot of JavaScript or whatever,
+
+00:48:06 but mostly because if I'm just building a simple app that needs a bit more than just being a static HTML page,
+
+00:48:14 it needs some interactivity, a little bit of reactivity.
+
+00:48:18 I feel like having the whole overhead of building an SPA or whatever tools you need for the whole JavaScript, TypeScript, whatever stack, it's just so much work to get a little bit to make a simple thing a little bit nicer, a little bit more reactive.
+
+00:48:34 And I feel like HTMX just fits right in there.
+
+00:48:37 It's super great.
+
+00:48:39 I've built a couple of things with it now, a few of my own projects, a few things at work.
+
+00:48:45 And it makes things so much easier where the work probably wouldn't have been done
+
+00:48:50 if it was just because it's too much.
+
+00:48:52 If you're doing a whole front end thing that you have then to deploy and build and whatever,
+
+00:48:57 or it would have been less nice.
+
+00:48:59 So it's an amazing, really amazing thing.
+
+00:49:02 As the maintainer and author, though, one of the things that is not frustrating, but it's understandable, is that HTMX is not for everybody, right?
+
+00:49:10 There's not, like, you can't use HTMX in all occasions, or Datastar, right?
+
+00:49:15 And so there are people that are always going to want to use React and there's going to be people that want to use all these other frameworks.
+
+00:49:20 And so having some cohesive way to make them all talk together, I think, is important.
+
+00:49:24 I don't have that answer yet, but I just know that like I can't always say HTMX is it, right?
+
+00:49:29 And then you'll have a great time because I'll inevitably meet somebody that says I need to do this. 
+
+00:49:33 And they're right. And a single-page application or something is more appropriate for that.
+
+00:49:37 And so it's obviously the right tool for the right job when you need it.
+
+00:49:41 But, you know, I want to make something that is cohesive depending on whatever library you want to use.
+
+00:49:45 I would throw one thing in there, though.
+
+00:49:47 I would rather somebody start with HTMX than start with React if you don't need it.
+
+00:49:51 Because React can be total overkill. It can be great for some applications.
+
+00:49:54 But oftentimes, as consultants, we see people like having an about page and they throw React at it.
+
+00:49:59 Like, why do you need that?
+
+00:50:00 Like, especially for small things with partials.
+
+00:50:02 Do you mean you don't want to start with Angular?
+
+00:50:03 You know, it's fine if you need it, but I don't think you really need it.
+
+00:50:07 Like, introduce tools as you need them.
+
+00:50:10 Django 6.0 just added template partials, and I guess my job here is to hand off to Carlton
+
+00:50:14 because this is his feature.
+
+00:50:15 Yeah, I was happy to see that come in there, Carlton.
+
+00:50:17 Nice job.
+
+00:50:17 No, it's okay.
+
+00:50:19 Plug the new feature. 
+
+00:50:20 So, I mean, I stepped down as a fellow in 2023 into a new business,
+
+00:50:25 and I read the essay about template fragments on the HTMX website, where it's these, like, named reusable bits in the templates. And I was
+
+00:50:34 like, I need that. So I built django-template-partials, released as a third-party package, and it's now just
+
+00:50:38 been merged into core for Django 6.0. And I have to say about HTMX, it's really changed the way I write
+
+00:50:44 websites. Before I was the fellow, I used to write mobile applications and do the, do the front
+
+00:50:49 end of the mobile application, then the back end in Django using Django REST Framework. And, I was,
+
+00:50:52 that's how I got into, you know, open source, was by Django REST Framework. And since starting the
+
+00:50:58 business, we're three years in, we've hardly got a JSON endpoint in sight. It's like two, three, four of
+
+00:51:02 them in the whole application. And it's, oh, it's just a delight again. You know, you asked me at the
+
+00:51:08 beginning, Michael, am I having fun? Yeah, I really am having fun, and HTMX is the reason. I do grant
+
+00:51:12 there are, you know, these use cases. Awesome. All right, let's talk about our last topic, and we have
+
+00:51:18 five minutes-ish to do that. So we gotta, we gotta stay on target, quick. But let's just go around,
+
+00:51:24 run real quick here. We talked about upgrading the Python version, getting better performance out of
+
+00:51:29 it. I mentioned the lower memory side, but I think one of the underappreciated aspects of this, you
+
+00:51:36 know, the Instagram team did a huge talk, you know, on it a while ago, is the memory that you run into
+
+00:51:43 when you start to scale out your stuff on the server. Because you're like, oh, I want to have four
+
+00:51:47 workers so I can have more concurrency because of the GIL. So now you've got four copies of
+
+00:51:51 everything that you cache in memory, and just like the runtime, and now you need eight gigs instead of
+
+00:51:56 what 
would have been one, or who knows, right? But with free-threaded Python coming along, which I've
+
+00:52:02 seen a couple of comments in the chat, like, hey, tell us about this, it's, we could have true
+
+00:52:08 concurrency and we wouldn't need to scale as much on the process side, I think giving us both better
+
+00:52:13 performance and the ability to say, well, you actually have four times less memory, so you could
+
+00:52:17 run smaller servers or whatever. What's the free-threaded story for all the frameworks? Carlton,
+
+00:52:22 let's go back to you. We'll do it in reverse. I'm really excited about it. I don't know how it's
+
+00:52:26 going to play out, but I'm really excited about it. All it can do is help Django. The async story in
+
+00:52:31 Django is nice and mature now, but still, most of it's sync. Like, you know, you're still going to
+
+00:52:36 default to sync, you're still going to write your sync views, you've still got template rendering, you
+
+00:52:39 know, Django's a template-based kind of framework, really. You're still going to want to
+
+00:52:43 run things synchronously, concurrently, and proper threads are going to be, yeah, they can't but help.
+
+00:52:50 I don't know how it's going to roll out. I'll let someone else go because I'm getting locked up.
+
+00:52:53 Yeah, let me just elaborate on that for people out there before we move on: you could set up your
+
+00:52:59 worker process to say, I want you to actually run eight threads in this one worker process.
+
+00:53:05 And when multiple requests come in, they could both be sent off to the same worker to be processed.
+
+00:53:10 And that allows that worker to do more unless the GIL comes along and says, stop, you only
+
+00:53:15 get to do one thing at a time in threads in Python.
+
+00:53:17 And all of a sudden, a lot of that falls down.
+
+00:53:19 This basically uncorks that and makes that easy all of a sudden.
+
+00:53:22 Even if you yourself are not writing async, your server can be more async. 
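What "uncorking the GIL" means for a worker with a thread pool, as a minimal sketch: CPU-bound work spread over threads. On a GIL build the threads serialize; on a free-threaded build (e.g. a 3.13t/3.14t interpreter with the GIL disabled), the same code can actually use multiple cores.

```python
import sys
import time
from concurrent.futures import ThreadPoolExecutor

def burn(n: int) -> int:
    """CPU-bound stand-in for request-handling work."""
    total = 0
    for i in range(n):
        total += i * i
    return total

N, WORKERS = 400_000, 4
start = time.perf_counter()
with ThreadPoolExecutor(max_workers=WORKERS) as pool:
    results = list(pool.map(burn, [N] * WORKERS))
elapsed = time.perf_counter() - start

# sys._is_gil_enabled() exists on 3.13+; assume the GIL is on for older versions.
gil = getattr(sys, "_is_gil_enabled", lambda: True)()
print(f"GIL enabled: {gil}; {WORKERS} threads took {elapsed:.2f}s")
```

Run it on a standard build and a free-threaded build side by side: the results are identical, but the wall-clock time for the threaded section should drop roughly with core count only on the free-threaded build.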
+
+00:53:26 Yeah.
+
+00:53:26 And this is the thing that we found with ASGI is that you dispatch to a, you know, using
+
+00:53:31 sync_to_async, or you dispatch it to a thread, a thread pool executor, but Python doesn't run
+
+00:53:36 that concurrently.
+
+00:53:37 And so it's like, or in parallel.
+
+00:53:39 So it's like, ah, it doesn't actually go as fast as you want it to.
+
+00:53:42 And so you end up wanting multiple processes still.
+
+00:53:45 All right, let's keep it with Django.
+
+00:53:46 Jeff, what do you think?
+
+00:53:47 I'm going to defer to the others on this.
+
+00:53:48 I have the least thoughts.
+
+00:53:49 All right, right down the stack, Sebastian.
+
+00:53:51 Right down the video, not website, web framework.
+
+00:53:55 I think it's going to be awesome.
+
+00:53:56 This is going to help so much, so many things.
+
+00:53:59 The challenge is going to be third-party libraries used by each individual application
+
+00:54:05 and if they are compatible or not.
+
+00:54:07 That's where the challenge is going to be.
+
+00:54:08 But other than that, it's just going to be free extra performance for everyone.
+
+00:54:13 Just, you know, like just upgrading the version of Python.
+
+00:54:15 So that's going to be us.
+
+00:54:16 Cody.
+
+00:54:16 Yeah, I'm going to echo what Sebastian just said.
+
+00:54:18 The third-party libraries, I think, are going to be the big kind of sticky point here.
+
+00:54:21 I'm looking forward to seeing what we can do.
+
+00:54:23 I'm going to kind of hold my thoughts on that.
+
+00:54:24 I'll let Yannick speak a little bit on it because I know that he's looked at msgspec specifically
+
+00:54:28 and some of the other things that might, you know, give some better context here.
+
+00:54:31 But yes, the third-party libraries are going to be kind of the sticky issue,
+
+00:54:36 but I'm looking forward to seeing what we can make happen. 
+
+00:54:38 I'm super excited, actually, specifically about async stuff,
+
+00:54:43 because for most of the time, it's like, if you can already saturate your CPU,
+
+00:54:50 async doesn't help you much.
+
+00:54:51 Well, now, if you have proper threads, you can actually do that in async as well.
+
+00:54:56 And I think it's going to speed up a lot of applications
+
+00:55:00 just by default, because almost all async applications out there
+
+00:55:06 use threads in some capacity because, well, most things aren't async by nature.
+
+00:55:12 So they will use a thread pool and it will run more concurrently.
+
+00:55:16 And so that's going to be better.
+
+00:55:18 But I'm also a bit scared about a few things, mainly, as a few others have said now,
+
+00:55:26 third-party library extensions, specifically those that are Python C extensions.
+
+00:55:33 Just recently, I think like three weeks ago, we got msgspec released for Python 3.14
+
+00:55:40 and proper free-threading support.
+
+00:55:42 And that took a lot of work.
+
+00:55:44 Fortunately, a few of the Python core devs chimed in and contributed PRs
+
+00:55:49 and helped out with that.
+
+00:55:52 And all around the ecosystem, the last few years,
+
+00:55:55 there's been a lot of work going on.
+
+00:55:57 But especially for more niche libraries that are still here and there,
+
+00:56:02 I think there's still a lot to do and possibly also quite a few bugs lurking here and there
+
+00:56:09 that haven't been found or are really hard to track down.
+
+00:56:13 I'm curious and maybe a bit scared, that's too hard a word, but I'm cautious.
+
+00:56:18 It's going to be a little bit of a bumpy ride as people turn that on
+
+00:56:22 and then see the reality of what's happening. 
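One flavor of the "bugs lurking" worry above is the classic unsynchronized read-modify-write. A minimal sketch: on a GIL build the unlocked version usually gets away with it, but under free threading it can silently lose updates, while the locked version is correct on any build.

```python
import threading

class Counter:
    def __init__(self) -> None:
        self.value = 0
        self._lock = threading.Lock()

    def increment_unsafe(self) -> None:
        self.value += 1              # load, add, store: three steps, not atomic

    def increment(self) -> None:
        with self._lock:             # serialize the read-modify-write
            self.value += 1

counter = Counter()

def worker() -> None:
    for _ in range(10_000):
        counter.increment()

threads = [threading.Thread(target=worker) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()
print(counter.value)  # → 40000, deterministic because of the lock
```

Swap `increment` for `increment_unsafe` and run it on a free-threaded build, and the final count can come up short; these are exactly the latent bugs that only surface once threads truly run in parallel.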
+
+00:56:24 However, I want to take Cody's warning and turn it on its head
+
+00:56:28 about these third-party libraries, because I think it's also an opportunity for regular Python developers who are not async fanatics to actually capture some of that capability.
+
+00:56:39 Say some library says, hey, I realize that if we actually implement this lower-level thing, whose implementation you don't actually see, in true threading.
+
+00:56:48 And then you use it, but you don't actually do threading.
+
+00:56:50 You just call even a blocking function.
+
+00:56:52 You might get a huge performance boost, a little bit like David was talking about with MarkupSafe.
+
+00:56:57 And you could just all of a sudden, with doing nothing
+
+00:57:00 with your code, go five times faster on an eight-core
+
+00:57:04 machine or something in little places where it used to matter.
+
+00:57:06 I'm super excited for--
+
+00:57:09 we're currently focused on the things that are out there
+
+00:57:12 right now and that might need to be updated.
+
+00:57:15 But I'm super excited for what else might come of this,
+
+00:57:19 new things that will be developed or stuff that we are currently not thinking about
+
+00:57:25 or that hadn't been considered for the past 30 years or so,
+
+00:57:29 because it just wasn't feasible or wasn't possible or didn't make sense at all.
+
+00:57:33 I think it would pay off definitely.
+
+00:57:36 All right. Team Flask.
+
+00:57:37 You guys got the final word.
+
+00:57:39 I think it would probably be more advantageous to WSGI apps
+
+00:57:42 than it will for ASGI apps.
+
+00:57:44 And when I've been playing with it, it's mostly on the WSGI Flask side
+
+00:57:47 where I'm quite excited about it.
+
+00:57:48 At the same time, like the others, I'm a bit worried because it's not clear to me,
+
+00:57:52 for example, that green threading is going to work that well
+
+00:57:55 with free threading. 
+
+00:57:56 And that may have been fixed, but I don't think it has yet.
+
+00:57:58 And that might then break a lot of WSGI apps.
+
+00:58:02 So next, I think.
+
+00:58:03 But yeah, very excited for Flask in particular.
+
+00:58:06 Thanks for bringing up green threading.
+
+00:58:07 I added that to my notes to mention right now.
+
+00:58:11 So Flask already has emphasized for years and years and years
+
+00:58:16 that don't store stuff globally, don't have global state,
+
+00:58:19 bind stuff to the request-response cycle if you need to store stuff,
+
+00:58:23 look stuff up from a cache otherwise.
+
+00:58:24 And my impression is that that emphasis is pretty successful.
+
+00:58:27 I don't think there's any well-known extensions using like global state or anything like that.
+
+00:58:31 It's helped that the dev server that we have is threaded by default.
+
+00:58:36 Like it's not going for performance, obviously, it's just running on your local machine, but
+
+00:58:39 it's already like running in a threaded environment, running your application in a
+
+00:58:42 threaded environment, not a process-based one by default.
+
+00:58:45 I don't know if anybody even knows that you can run the dev server as process-based.
+
+00:58:49 And we also already had for a decade or more than a decade,
+
+00:58:53 Gevent to enable the exact same thing that free threading is enabling for Flask,
+
+00:58:59 which is concurrent work and connections.
+
+00:59:03 And so plenty of applications are already deployed that way
+
+00:59:06 using Gevent to do kind of what ASGI is enabling.
+
+00:59:10 I've run all the test suites with pytest-freethreaded, which checks that your tests can run concurrently in the free-threaded builds.
+
+00:59:18 So go check that out, by Anthony Shaw.
+
+00:59:20 And I'm pretty sure Granian already supports free threading.
+
+00:59:23 Not sure though, I haven't looked into Granian enough.
+
+00:59:25 But like-
+
+00:59:26 You know, I'm not sure either. 
+
+00:59:27 It does have a runtime threaded mode, but I don't know if that's truly free-threaded or not.
+
+00:59:32 All of those things combined make me pretty optimistic that Flask will be able to take advantage of this
+
+00:59:38 without much work from us.
+
+00:59:40 I mean, I know that's a big statement right there, and I haven't tested it,
+
+00:59:43 but the fact that we've emphasized all these different parts for so long already
+
+00:59:47 makes me confident about it.
+
+00:59:48 I'm also super excited about it.
+
+00:59:49 And just one final thought I'll throw out there before we call it a show,
+
+00:59:52 because we could go on for much longer, but we're out of time.
+
+00:59:55 I think once this comes along, whatever framework out of this choice you're using out there,
+
+01:00:01 there's a bunch of inner working pieces.
+
+01:00:04 One of them may have some kind of issue.
+
+01:00:06 And I think it's worth doing some proper load testing
+
+01:00:08 on your app, you know, point something like Locust.io at it
+
+01:00:12 and just say, well, what if we gave it 10,000 concurrent users
+
+01:00:14 for an hour?
+
+01:00:15 Does it stop working?
+
+01:00:16 Does it crash?
+
+01:00:17 Or does it just keep going?
+
+01:00:18 You're like, so that seems like a pretty good thing to do the first time before you deploy
+
+01:00:22 your first free-threaded version.
+
+01:00:23 Yeah.
+
+01:00:24 All right, everyone.
+
+01:00:25 I would love to talk some more.
+
+01:00:26 This is such a good conversation, but I also want to respect your time and all that.
+
+01:00:31 So thank you for being here.
+
+01:00:32 It's been an honor to get you all together and have this conversation.
+
+01:00:35 Thank you very much for having us.
+
+01:00:37 Thank you.
+
+01:00:37 Yeah.
+
+01:00:37 Thanks for having us all.
+
+01:00:38 Thanks, everybody.
+
+01:00:39 Yeah.
+
+01:00:39 It's nice being here.
+
+01:00:40 Yeah.
+
+01:00:40 Thanks for having us.
+
+01:00:41 Thanks for having us all. 
+
+01:00:42 Bye.
+
+01:00:42 Bye-bye.
+
+01:00:44 This has been another episode of Talk Python To Me.
+
+01:00:47 Thank you to our sponsors.
+
+01:00:48 Be sure to check out what they're offering.
+
+01:00:49 It really helps support the show.
+
+01:00:51 If you or your team needs to learn Python, we have over 270 hours of beginner and advanced courses
+
+01:00:57 on topics ranging from complete beginners to async code,
+
+01:01:00 Flask, Django, HTMX, and even LLMs.
+
+01:01:03 Best of all, there's no subscription in sight.
+
+01:01:06 Browse the catalog at talkpython.fm.
+
+01:01:09 And if you're not already subscribed to the show on your favorite podcast player, what are you waiting for?
+
+01:01:14 Just search for Python in your podcast player.
+
+01:01:16 We should be right at the top.
+
+01:01:17 If you enjoyed that geeky rap song, you can download the full track.
+
+01:01:20 The link is actually in your podcast player's show notes.
+
+01:01:23 This is your host, Michael Kennedy.
+
+01:01:24 Thank you so much for listening.
+
+01:01:26 I really appreciate it.
+
+01:01:27 I'll see you next time.
+
+01:01:38 I started to meet.
+
+01:01:40 And we're ready to roll.
+
+01:01:43 Upgrade the code.
+
+01:01:45 No fear of getting whole.
+
+01:01:48 We tapped into that modern vibe over King Storm.
+
+01:01:53 Talk Python To Me, async is the norm.
+
diff --git a/transcripts/533-web-frameworks-in-prod-by-their-creators.vtt b/transcripts/533-web-frameworks-in-prod-by-their-creators.vtt
new file mode 100644
index 0000000..c322bf6
--- /dev/null
+++ b/transcripts/533-web-frameworks-in-prod-by-their-creators.vtt
@@ -0,0 +1,3638 @@
+WEBVTT
+
+00:00:00.020 --> 00:00:04.940
+Today on Talk Python, the creators behind FastAPI, Flask, Django, Quart, and Litestar
+
+00:00:05.500 --> 00:00:09.740
+get practical about running apps based on their frameworks in production. 
+
+00:00:10.580 --> 00:00:15.060
+Deployment patterns, async gotchas, servers, scaling, and the stuff that you only learn
+
+00:00:15.060 --> 00:00:16.900
+at 2 a.m. when the pager starts going off.
+
+00:00:17.460 --> 00:00:20.700
+For Django, we have Carlton Gibson and Jeff Triplett.
+
+00:00:21.140 --> 00:00:23.220
+For Flask, we have David Lord and Phil Jones.
+
+00:00:23.860 --> 00:00:27.660
+And on Team Litestar, we have Yannick Nouvertné and Cody Fincher.
+
+00:00:28.140 --> 00:00:31.700
+And finally, Sebastian Ramirez from FastAPI is here as well.
+
+00:00:32.279 --> 00:00:32.860
+Let's jump in.
+
+00:00:33.240 --> 00:00:38.400
+This is Talk Python To Me, episode 533, recorded December 17th, 2025.
+
+00:00:55.900 --> 00:00:57.260
+Welcome to Talk Python To Me,
+
+00:00:57.380 --> 00:01:00.620
+the number one Python podcast for developers and data scientists.
+
+00:01:01.300 --> 00:01:02.520
+This is your host, Michael Kennedy.
+
+00:01:02.900 --> 00:01:06.440
+I'm a PSF fellow who's been coding for over 25 years.
+
+00:01:07.100 --> 00:01:08.200
+Let's connect on social media.
+
+00:01:08.540 --> 00:01:11.680
+You'll find me and Talk Python on Mastodon, Bluesky, and X.
+
+00:01:11.870 --> 00:01:13.800
+The social links are all in your show notes.
+
+00:01:14.560 --> 00:01:18.080
+You can find over 10 years of past episodes at talkpython.fm.
+
+00:01:18.260 --> 00:01:21.560
+And if you want to be part of the show, you can join our recording live streams.
+
+00:01:21.760 --> 00:01:22.280
+That's right.
+
+00:01:22.470 --> 00:01:25.740
+We live stream the raw uncut version of each episode on YouTube.
+
+00:01:26.140 --> 00:01:30.740
+Just visit talkpython.fm/youtube to see the schedule of upcoming events.
+
+00:01:30.950 --> 00:01:34.620
+Be sure to subscribe there and press the bell so you'll get notified anytime we're recording. 
+ +00:01:35.400 --> 00:01:39.500 +Hey, before we jump into the interview, I just want to send a little message to all the companies + +00:01:39.800 --> 00:01:43.420 +out there with products and services trying to reach developers. + +00:01:44.260 --> 00:01:45.680 +That is the listeners of this show. + +00:01:46.160 --> 00:01:50.000 +As we're rolling into 2026, I have a bunch of spots open. + +00:01:50.130 --> 00:01:55.980 +So please reach out to me if you're looking to sponsor a podcast or just generally sponsor + +00:01:56.040 --> 00:01:59.820 +things in the community and you haven't necessarily considered podcasts, you really should. + +00:02:00.720 --> 00:02:04.120 +Reach out to me and I'll help you connect with the Talk Python audience. + +00:02:05.080 --> 00:02:10.220 +Thanks everyone for listening all of 2025. And here we go into 2026. Cheers. + +00:02:11.640 --> 00:02:19.140 +Hello, hello, Carlton, Sebastian, David, Cody, Yannick, Phil, Jeff, welcome back to Talk Python, + +00:02:19.300 --> 00:02:22.920 +all of you. Thanks for having us. Thank you for having us. Happy to be here again. We're here for + +00:02:22.940 --> 00:02:30.260 +what may be my favorite topic for sure. Something I spend most of my time on is web API stuff, + +00:02:30.600 --> 00:02:36.320 +which is awesome. So excited to have you here to give your inside look at how people should + +00:02:36.320 --> 00:02:41.400 +be running your framework, at least the one that you significantly contribute to, depending on + +00:02:42.040 --> 00:02:47.500 +which framework we're talking about, right? It's going to be a lot of fun, and I'm really excited + +00:02:47.500 --> 00:02:51.739 +to talk about it. However, there's an interesting fact that I've been throwing out a lot lately is + +00:02:51.760 --> 00:02:56.680 +that fully half of the people doing professional Python development have only been doing it for two + +00:02:56.820 --> 00:03:01.760 +years or less. 
And some of you been on the show, it was maybe two years longer than that. Let's just + +00:03:01.760 --> 00:03:06.740 +do a quick round of introductions for people who don't necessarily know you. We'll go around the + +00:03:06.880 --> 00:03:11.280 +squares here in the screen sharing. So Carlton, you're up first. Oh, I get to go first. Brilliant. + +00:03:11.520 --> 00:03:15.980 +Well, I'm Carlton. I work on the Django Red framework mostly. I'm a former Django fellow. + +00:03:16.080 --> 00:03:20.220 +I maintain a number of packages in the ecosystem. And the last few years I've been back to building + +00:03:20.240 --> 00:03:25.420 +stuff with Django rather than working on it. So I run a build startup that's, well, we're still + +00:03:25.600 --> 00:03:30.680 +going. So I'm quite excited about that. Awesome. How is it to be building with Django than building + +00:03:30.900 --> 00:03:36.060 +Django? Oh, I'm literally having the time of my life. Like I spent five years as a Django fellow + +00:03:36.200 --> 00:03:42.000 +working on Django and I just built up this backlog of things I wanted to do and I had no time and no + +00:03:42.200 --> 00:03:46.740 +capacity and no, no sort of nothing to work on with them. And it's just, it's just a delight. + +00:03:46.980 --> 00:03:50.460 +And every day I sit down on my computer thinking, oh, what's today? + +00:03:50.610 --> 00:03:51.420 +I look at the background. + +00:03:51.780 --> 00:03:52.100 +Oh, yes. + +00:03:52.680 --> 00:03:54.260 +And every day, a delight. + +00:03:54.400 --> 00:03:56.640 +So I'm still just loving it. + +00:03:56.700 --> 00:03:57.100 +That's awesome. + +00:03:57.250 --> 00:04:01.500 +So more often you're appreciating your former self than cursing your former self + +00:04:01.680 --> 00:04:02.300 +for the way you built. + +00:04:04.040 --> 00:04:05.840 +Yeah, that's an interesting one. + +00:04:05.840 --> 00:04:07.100 +I think we should move on before. + +00:04:07.960 --> 00:04:08.300 +All right. 
+
+00:04:08.420 --> 00:04:08.620
+All right.
+
+00:04:09.040 --> 00:04:14.100
+Speaking of building with and for, Sebastian, FastAPI.
+
+00:04:14.230 --> 00:04:14.340
+Hello.
+
+00:04:14.700 --> 00:04:14.900
+Hello.
+
+00:04:15.980 --> 00:04:18.100
+So, okay, intro for the ones that don't know me.
+
+00:04:18.100 --> 00:04:19.060
+I'm Sebastian Ramirez.
+
+00:04:19.579 --> 00:04:21.100
+I created FastAPI.
+
+00:04:21.620 --> 00:04:23.000
+Yeah, that's pretty much it.
+
+00:04:23.340 --> 00:04:27.780
+And now I have been building a company since the last two years, FastAPI Cloud, to deploy
+
+00:04:27.980 --> 00:04:28.360
+FastAPI.
+
+00:04:28.400 --> 00:04:33.060
+So, I get to drink from funny cups, as you can see.
+
+00:04:33.520 --> 00:04:34.720
+The world's best boss.
+
+00:04:35.220 --> 00:04:35.400
+Amazing.
+
+00:04:36.180 --> 00:04:39.520
+So, I think you deserve to give a bit of a shout out to FastAPI Cloud.
+
+00:04:39.680 --> 00:04:40.200
+That's a big deal.
+
+00:04:40.360 --> 00:04:40.660
+Thank you.
+
+00:04:40.860 --> 00:04:41.480
+Thank you very much.
+
+00:04:41.620 --> 00:04:42.880
+Yeah, it's super fun.
+
+00:04:42.900 --> 00:04:47.080
+And the idea is to make it super simple to deploy FastAPI applications.
+
+00:04:47.540 --> 00:04:52.120
+The idea with FastAPI was to make it very simple to build applications, build APIs,
+
+00:04:52.940 --> 00:04:57.700
+and get from idea to product in record time.
+
+00:04:57.800 --> 00:04:59.060
+That was the idea with FastAPI.
+
+00:04:59.140 --> 00:05:02.820
+But then deploying that, in many cases, is just too cumbersome.
+
+00:05:02.880 --> 00:05:03.500
+It's too complicated.
+
+00:05:03.820 --> 00:05:05.260
+There are just so many things to that.
+
+00:05:05.700 --> 00:05:09.200
+So I wanted to bring something for people to be able to say, like,
+
+00:05:09.360 --> 00:05:12.360
+hey, just one command, FastAPI deploy, and we take care of the rest. 
+ +00:05:12.700 --> 00:05:15.660 +And then me and the team, I have an amazing team + +00:05:15.660 --> 00:05:16.980 +that I've been able to work with. + +00:05:17.320 --> 00:05:20.200 +We suffer all the cloud pains so that people + +00:05:20.380 --> 00:05:21.400 +don't have to deal with that. + +00:05:21.520 --> 00:05:25.840 +And yeah, it's painful to build, but it's so cool to use it. + +00:05:25.860 --> 00:05:27.780 +You know, like that's the part when I say like, yes, + +00:05:28.220 --> 00:05:29.400 +this was worth it. + +00:05:29.480 --> 00:05:32.240 +When I get to use the thing myself, that is super cool. + +00:05:32.240 --> 00:05:34.240 +Yeah, I'm assuming you build FastAPI + +00:05:34.400 --> 00:05:35.560 +Cloud with FastAPI somewhat. + +00:05:35.720 --> 00:05:37.440 +Yes, yes, yes, exactly. + +00:05:37.820 --> 00:05:39.720 +FastAPI Cloud runs on FastAPI Cloud. + +00:05:40.420 --> 00:05:44.200 +And I get just like now random things in there and like, yes. + +00:05:44.480 --> 00:05:45.260 +Congrats to that again. + +00:05:45.440 --> 00:05:45.980 +That's super cool. + +00:05:46.520 --> 00:05:47.220 +David Lord, welcome. + +00:05:47.600 --> 00:05:47.880 +Welcome back. + +00:05:48.040 --> 00:05:48.260 +Yeah. + +00:05:48.460 --> 00:05:48.580 +Hello. + +00:05:49.080 --> 00:05:49.640 +I'm David Lord. + +00:05:49.960 --> 00:05:54.820 +I'm the lead maintainer of Pallets, which is Flask, Jinja, Click, Werkzeug, + +00:05:55.020 --> 00:05:56.020 +ItsDangerous, MarkupSafe. + +00:05:56.560 --> 00:06:02.540 +And now Pallets Eco, which is a bunch of the famous extensions for Flask that are getting + +00:06:02.800 --> 00:06:03.720 +community maintenance now. + +00:06:04.100 --> 00:06:08.200 +I've been doing that since, I think I've been the lead maintainer since like 2019, but a + +00:06:08.400 --> 00:06:09.460 +maintainer since like 2017. + +00:06:09.920 --> 00:06:10.560 +So it's been a while.
+ +00:06:10.660 --> 00:06:11.380 +That's been a while. + +00:06:11.560 --> 00:06:13.860 +We're coming up on seven, eight years. + +00:06:14.160 --> 00:06:14.520 +That's crazy. + +00:06:15.200 --> 00:06:15.540 +Time flies. + +00:06:15.580 --> 00:06:18.280 +It's always funny because I always feel like I've been doing it for way, way longer. + +00:06:18.600 --> 00:06:21.340 +And then I look at the actual date that I got added as a maintainer. + +00:06:21.340 --> 00:06:22.660 +I'm like, well, it couldn't have been that late. + +00:06:22.800 --> 00:06:23.860 +I was doing stuff before that, right? + +00:06:24.360 --> 00:06:27.720 +Well, I'm sure you were deep in Flask before you got added as a maintainer of it, right? + +00:06:28.140 --> 00:06:28.200 +Yeah. + +00:06:28.320 --> 00:06:31.780 +Phil Jones, since you are also on the same org, next. + +00:06:32.240 --> 00:06:32.900 +Hey, welcome back. + +00:06:32.980 --> 00:06:33.080 +Hello. + +00:06:33.440 --> 00:06:34.500 +Yeah, I'm Phil Jones. + +00:06:34.720 --> 00:06:37.760 +I am the author of Quart, which is also part of Pallets. + +00:06:37.960 --> 00:06:41.800 +I also work on Werkzeug and Flask and help out there. + +00:06:42.220 --> 00:06:44.660 +And I've done a server called Hypercorn as well. + +00:06:44.820 --> 00:06:47.680 +So a bit of interest in that part of the ecosystem. + +00:06:47.860 --> 00:06:49.300 +What is Quart for people who don't know? + +00:06:50.000 --> 00:06:52.840 +Quart is basically Flask with async await. + +00:06:53.340 --> 00:06:56.880 +And that was the idea behind it really to make it possible to do async await. + +00:06:57.240 --> 00:06:58.800 +So yeah, that's pretty much it. + +00:06:58.800 --> 00:07:00.820 +If we, when we manage to merge them, we will. + +00:07:00.920 --> 00:07:07.240 +And the goal now with Quart as part of Pallets is to eventually have it be one code base with Flask.
+ +00:07:07.600 --> 00:07:11.760 +But given that we both have small children now, we're moving a lot slower. + +00:07:13.060 --> 00:07:13.900 +Having kids is great. + +00:07:14.040 --> 00:07:14.820 +I have three kids. + +00:07:15.480 --> 00:07:19.920 +Productivity is not a thing that they are known to imbue on the parents, right? + +00:07:20.060 --> 00:07:21.320 +Especially in the early days. + +00:07:21.780 --> 00:07:23.080 +I want to say, Phil, thank you. + +00:07:23.200 --> 00:07:26.820 +I've been running Quart for a couple of my websites lately, and it's been amazing. + +00:07:26.980 --> 00:07:27.260 +Nice. + +00:07:27.580 --> 00:07:29.080 +Yeah, I also use it at work. + +00:07:29.200 --> 00:07:31.840 +We've got all our stuff in Quart, which is, yeah, it's really good fun. + +00:07:31.960 --> 00:07:32.660 +A bit like Carlton. + +00:07:32.800 --> 00:07:39.240 +So when people, if they listen to the show or they go to the website of the show and they're not on YouTube, then that somehow involves Quart. + +00:07:39.640 --> 00:07:40.160 +Janek, welcome. + +00:07:40.560 --> 00:07:40.700 +Hey. + +00:07:41.160 --> 00:07:42.300 +Yeah, I'm Janek Nouvertné. + +00:07:42.400 --> 00:07:44.260 +I work on Litestar. + +00:07:44.360 --> 00:07:47.300 +I just looked up how long it's been because I was curious myself. + +00:07:48.020 --> 00:07:52.780 +I also had the same feeling that it's been a lot longer than, it's actually only been three years. + +00:07:53.380 --> 00:07:53.460 +Yeah. + +00:07:53.820 --> 00:07:57.060 +And I also, I noticed something with all you guys here in the room. + +00:07:57.200 --> 00:08:00.580 +I use almost all of the projects you maintain at work, + +00:08:01.960 --> 00:08:04.360 +which is quite nice. + +00:08:04.420 --> 00:08:06.660 +We have a very big Django deployment. + +00:08:06.840 --> 00:08:08.180 +We have some Flask deployments. + +00:08:08.180 --> 00:08:09.760 +We have a few FastAPI deployments.
+ +00:08:10.100 --> 00:08:12.360 +I think we have one Quart deployment + +00:08:12.700 --> 00:08:14.560 +and we also have two Litestar deployments, + +00:08:15.280 --> 00:08:17.620 +which obviously is a lot of fun to work with. + +00:08:17.800 --> 00:08:20.420 +And I find it really, really nice actually + +00:08:20.560 --> 00:08:23.240 +to work with all these different things. + +00:08:23.400 --> 00:08:26.039 +It's super interesting also because like everything + +00:08:26.060 --> 00:08:29.540 +has its own niche that it's really good at. + +00:08:29.760 --> 00:08:31.600 +And even, you know, you think + +00:08:31.910 --> 00:08:33.260 +if you maintain a framework yourself, + +00:08:33.539 --> 00:08:36.099 +you tend to always recommend it for everything. + +00:08:36.440 --> 00:08:38.440 +But I noticed it's not actually true. + +00:08:38.640 --> 00:08:40.360 +There's actually quite a few cases + +00:08:40.640 --> 00:08:42.479 +where I don't recommend Litestar. + +00:08:42.580 --> 00:08:44.480 +I recommend, you know, just, you know, + +00:08:44.680 --> 00:08:46.740 +use Django for this or, you know, + +00:08:47.120 --> 00:08:49.340 +use Flask for that or use FastAPI for this + +00:08:49.430 --> 00:08:52.200 +because, well, they are quite different after all. + +00:08:52.340 --> 00:08:54.999 +And I find that really, really interesting + +00:08:55.020 --> 00:09:00.480 +and nice. And I think it's a good sign of a healthy ecosystem if it's not just, you know, + +00:09:00.740 --> 00:09:04.560 +the same thing, but different, but it actually brings something very unique and different to + +00:09:04.560 --> 00:09:08.760 +the table. I think that's a great attitude. And it's very interesting. You know, I feel like + +00:09:08.880 --> 00:09:13.280 +there's a lot of people who feel like they've kind of got to pick their tech team for everything. + +00:09:13.820 --> 00:09:17.260 +I'm going to build a static site. Like, well, I've got to have a Python-based static site builder.
+ +00:09:17.300 --> 00:09:21.220 +Like, well, it's a static site. Who cares what technology makes it turn? You're writing Markdown, + +00:09:21.260 --> 00:09:22.080 +and out comes HTML. + +00:09:22.350 --> 00:09:24.000 +Who cares what's in the middle, for example, right? + +00:09:24.540 --> 00:09:27.600 +And, you know, I feel like that's kind of a life lessons learned. + +00:09:28.060 --> 00:09:28.980 +Absolutely, yeah. + +00:09:29.140 --> 00:09:29.760 +Yeah, that's awesome. + +00:09:30.290 --> 00:09:31.140 +Cody, hello, hello. + +00:09:31.340 --> 00:09:32.780 +Yeah, hey guys, I'm Cody Fincher. + +00:09:32.900 --> 00:09:34.560 +I'm also one of the maintainers of Litestar. + +00:09:34.840 --> 00:09:37.400 +I've been there just a little bit longer than Yannick. + +00:09:37.940 --> 00:09:39.700 +And so it's been about four years now. + +00:09:40.080 --> 00:09:41.880 +And Yannick actually teed this up perfectly + +00:09:42.000 --> 00:09:43.380 +because I was going to say something very similar. + +00:09:43.500 --> 00:09:44.820 +I currently work for Google. + +00:09:44.960 --> 00:09:46.460 +I've been there for about three and a half years now. + +00:09:46.760 --> 00:09:49.240 +And we literally have every one of the frameworks + +00:09:49.370 --> 00:09:50.220 +you guys just mentioned, + +00:09:50.240 --> 00:09:51.240 +and they're all in production. + +00:09:51.480 --> 00:09:54.080 +And so one of the things that you'll see + +00:09:54.380 --> 00:09:56.420 +on the Litestar org and part of the projects + +00:09:56.440 --> 00:09:59.040 +that we maintain are that we have these optional batteries + +00:09:59.320 --> 00:10:01.060 +and most of the batteries that we have + +00:10:01.260 --> 00:10:03.120 +all work with the frameworks for you guys. + +00:10:03.220 --> 00:10:05.800 +And so it's nice to be able to use that stuff, + +00:10:06.080 --> 00:10:08.580 +you know, regardless of what tooling you've got + +00:10:08.720 --> 00:10:10.120 +or what project it is. 
+ +00:10:10.180 --> 00:10:12.140 +And so, yeah, having that interoperability + +00:10:12.400 --> 00:10:14.120 +and the ability to kind of go between the frameworks + +00:10:14.380 --> 00:10:16.800 +that work the best for the right situation is crucial. + +00:10:16.920 --> 00:10:18.500 +And so I'm glad you mentioned that, Yannick. + +00:10:18.520 --> 00:10:20.200 +But yeah, nice to see you guys + +00:10:20.220 --> 00:10:25.140 +on the show. Cody, tell people what Litestar is. I know I had both you guys and Jacob on a while + +00:10:25.260 --> 00:10:29.900 +ago, but it's been a couple of years, I think. Litestar at its core is really a web framework + +00:10:30.010 --> 00:10:36.260 +that kind of sits somewhere in between, I'd say, Flask and FastAPI and Django. So whereas, you know, + +00:10:36.300 --> 00:10:41.040 +Flask doesn't really, you know, bundle a lot of batteries. There's a huge amount of, you know, + +00:10:41.180 --> 00:10:44.540 +third-party libraries and ecosystem that's built around it that people can add into it, but there's + +00:10:44.560 --> 00:10:50.000 +not really like, for instance, a database adapter or a database plugin or plugins for + +00:10:50.180 --> 00:10:54.100 +Vite or something like that, right, for front-end development. And so what we have been doing + +00:10:54.130 --> 00:10:59.820 +is building an API framework that is very similar in concept to FastAPI that is also extensible. + +00:10:59.930 --> 00:11:03.280 +So if you want to use the batteries, they're there for you. But if you don't want to use + +00:11:03.380 --> 00:11:07.560 +them, you don't have to, right? And so a lot of the tooling that we built for Litestar + +00:11:07.600 --> 00:11:12.920 +was birthed out of a startup that I was in prior to joining Google. And so having all + +00:11:12.940 --> 00:11:15.160 +this boilerplate, really, it needed somewhere to go.
+ +00:11:15.320 --> 00:11:17.620 +And so a lot of this stuff ended up being plugins, + +00:11:17.960 --> 00:11:19.800 +which is what we bundled into Litestar + +00:11:19.940 --> 00:11:22.460 +so that you can kind of add in this extra functionality. + +00:11:22.960 --> 00:11:24.360 +And so I know I'm getting long-winded. + +00:11:24.480 --> 00:11:27.340 +It's somewhere between Django and Flask, + +00:11:27.400 --> 00:11:29.280 +if you were to think about it in terms of a spectrum, + +00:11:29.340 --> 00:11:32.520 +in terms of what it gives you in terms of a web framework. + +00:11:32.820 --> 00:11:34.960 +But in short, it does everything that all the other guys do. + +00:11:35.120 --> 00:11:37.520 +Very neat. It's definitely a framework I admire. + +00:11:37.820 --> 00:11:39.760 +Jeff Triplett, so glad you could make it. + +00:11:39.860 --> 00:11:40.720 +Yeah, thanks for having me. + +00:11:40.860 --> 00:11:42.560 +Yeah, I'm Jeff Triplett. I'm out of Lawrence, Kansas. + +00:11:42.840 --> 00:11:45.100 +I'm a consultant at a company called Revolution Systems. + +00:11:45.550 --> 00:11:48.320 +I was on, some people know me from being on the Python Software Foundation board. + +00:11:48.680 --> 00:11:49.920 +I've been off that for a few years. + +00:11:50.240 --> 00:11:52.880 +As of last week, I'm the president of the Django Software Foundation. + +00:11:53.290 --> 00:11:54.620 +So I've been on that board for a year. + +00:11:54.860 --> 00:11:56.620 +I'm kind of a Django power user, I guess. + +00:11:56.720 --> 00:11:57.900 +I've used it for about 20 years. + +00:11:58.420 --> 00:12:02.040 +And I've kind of not really worked on, I don't even think I have a patch anymore in Django. + +00:12:02.460 --> 00:12:04.280 +But I've done a lot with the community. + +00:12:04.720 --> 00:12:08.660 +I've done a lot with contributing through conferences and using utilities. + +00:12:09.260 --> 00:12:12.180 +I try to promote Carleton's applications like Neapolitan. 
+ +00:12:12.260 --> 00:12:15.780 +And if I like tools, Python tools in general, I try to advocate for it. + +00:12:16.040 --> 00:12:18.620 +I've also used all of these applications. + +00:12:18.900 --> 00:12:21.020 +Litestar, I haven't, but I have a friend who talks about it a lot. + +00:12:21.320 --> 00:12:22.860 +And so I feel like I know a lot about it. + +00:12:23.100 --> 00:12:25.440 +As a consultant, we tend to go with the best tool for the job. + +00:12:25.600 --> 00:12:26.840 +So I've done a little bit of FastAPI. + +00:12:27.340 --> 00:12:31.020 +I worked with Flask a lot over the years, even though we're primarily a Django shop. + +00:12:31.360 --> 00:12:32.500 +It just depends on what the client needs. + +00:12:32.580 --> 00:12:35.940 +And you see a lot of different sizes of web app deployments. + +00:12:36.100 --> 00:12:38.140 +So I think that's going to be an interesting angle for sure. + +00:12:38.320 --> 00:12:38.860 +Yeah, absolutely. + +00:12:39.220 --> 00:12:41.560 +Small ones to hundreds of servers. + +00:12:42.100 --> 00:12:46.400 +We don't see it as much anymore the last four or five years, especially with like CDNs and caching. + +00:12:46.800 --> 00:12:49.720 +We just don't see load like we did, you know, 10 years ago or so. + +00:12:50.060 --> 00:12:55.160 +And then I also do a lot of like small, I kind of call them some of them little dumb projects, but some are just fun. + +00:12:55.480 --> 00:12:59.800 +Like I've got a FastAPI web ring that I wrote a year ago for April Fool's Day. + +00:13:00.100 --> 00:13:03.440 +And for some reason that kind of took off and people liked it, even though it was kind of a joke. + +00:13:03.780 --> 00:13:07.620 +So I started like peppering it on a bunch of sites and I maintain like Django packages. + +00:13:08.000 --> 00:13:11.020 +I do a newsletter, Django News newsletter, just kind of lots of fun stuff. + +00:13:11.340 --> 00:13:13.360 +Definitely looking forward to hearing all of your opinions.
+ +00:13:14.000 --> 00:13:17.460 +So I've got a bunch of different your app in production topics + +00:13:17.580 --> 00:13:20.560 +I thought we could just work around or talk over. + +00:13:20.900 --> 00:13:24.180 +So I thought maybe the first one is what would you recommend, + +00:13:24.540 --> 00:13:26.340 +or if you don't really have a strong recommendation, + +00:13:26.620 --> 00:13:31.900 +what would you choose for yourself to put your app in your framework in production? + +00:13:32.080 --> 00:13:35.820 +I'm thinking app servers, reverse proxies like Nginx or Caddy. + +00:13:36.360 --> 00:13:37.320 +Do you go for threaded? + +00:13:37.860 --> 00:13:39.180 +Try to scale out with threads. + +00:13:39.340 --> 00:13:44.820 +Do you try to scale out with processes? Docker, no Docker, Kubernetes. What are we doing here, + +00:13:44.940 --> 00:13:49.560 +folks? Carlton. I think we'll just keep going around the circle here. So you may get the first + +00:13:49.880 --> 00:13:52.280 +round of everyone. No, I'll try to mix it up, but let's do it this time. + +00:13:52.360 --> 00:14:00.020 +I do the oldest school thing in the book. I run Nginx as my front end. I'll stick a + +00:14:00.100 --> 00:14:05.839 +WSGI server behind it with a pre-fork, a few workers, depending on CPU size, depending on + +00:14:05.860 --> 00:14:10.900 +the kind of requests I'm handling. These days, in order to handle long-lived requests, + +00:14:10.960 --> 00:14:14.860 +like server-sent events, that kind of, or WebSocket type things, I'll run an ASGI server as a kind + +00:14:14.860 --> 00:14:18.040 +of sidecar. I've been thinking about this a lot, actually. But yeah, this is interesting. + +00:14:18.280 --> 00:14:22.120 +If you're running a small site and you want long-lived requests, just run ASGI. Just use + +00:14:22.240 --> 00:14:29.360 +ASGI. Because any of the servers, Hypercorn, Uvicorn, Daphne, Granian is the new hot kid on + +00:14:29.360 --> 00:14:34.279 +the block, right?
All of those will handle your traffic, no problem. But for me, the scaling + +00:14:34.300 --> 00:14:39.720 +patterns of WSGI are so well known, and just like, I can do the maths on the back of an envelope. I know + +00:14:39.900 --> 00:14:45.740 +exactly how to scale it up, having done it for so long. For me, for my core application, I would still + +00:14:45.880 --> 00:14:51.080 +rather use the WSGI server and then limit the async stuff to just the use cases where it's + +00:14:51.180 --> 00:14:58.040 +particularly suited. So I'll do that. Process manager: I deploy using systemd. If I want to, if I + +00:14:58.120 --> 00:15:03.719 +want a container, I'll use Podman via systemd. It's as old school as it gets. I'll very often run + +00:15:03.740 --> 00:15:05.740 +a Redis instance on localhost for caching, + +00:15:05.960 --> 00:15:07.780 +and that will be it. + +00:15:08.100 --> 00:15:09.300 +And that will get me an awful long way. + +00:15:09.340 --> 00:15:10.020 +If I have to scale, + +00:15:10.880 --> 00:15:11.940 +I just get a bigger box. + +00:15:12.260 --> 00:15:12.960 +And a bigger box. + +00:15:13.020 --> 00:15:13.460 +Yeah, yeah, yeah. + +00:15:13.560 --> 00:15:16.040 +If I really, really, really need multiple boxes, + +00:15:16.260 --> 00:15:16.720 +well, then we'll talk. + +00:15:16.820 --> 00:15:18.400 +I feel like you and I are in a similar vibe. + +00:15:18.840 --> 00:15:20.680 +But one thing I want to sort of throw out there to you, + +00:15:20.760 --> 00:15:22.360 +but also sort of the others is, + +00:15:22.720 --> 00:15:23.780 +what are we talking with databases? + +00:15:24.300 --> 00:15:26.460 +Like, who is bold enough to go SQLite? + +00:15:26.740 --> 00:15:28.100 +Anyone's going SQLite out there? + +00:15:28.900 --> 00:15:30.020 +Yeah, it depends, right? + +00:15:30.100 --> 00:15:31.580 +It just depends on what you're doing, right? + +00:15:31.580 --> 00:15:33.460 +And how many concurrent users you're going to have.
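As a rough sketch of the split Carlton describes (Nginx in front, a pre-fork WSGI server for most traffic, and an ASGI server as a sidecar for the long-lived connections), the routing can look something like this. The ports, paths, and upstream names here are illustrative, not his actual config:

```nginx
# Illustrative only: send long-lived endpoints (SSE, WebSockets) to an
# ASGI sidecar and everything else to the pre-fork WSGI server.
upstream wsgi_app  { server 127.0.0.1:8000; }  # e.g. a WSGI server with a few workers
upstream asgi_side { server 127.0.0.1:8001; }  # e.g. an ASGI server like Hypercorn

server {
    listen 80;

    location /events/ {
        proxy_pass http://asgi_side;
        proxy_http_version 1.1;
        proxy_set_header Upgrade $http_upgrade;   # allow WebSocket upgrade
        proxy_set_header Connection "upgrade";
        proxy_buffering off;                      # don't buffer SSE streams
    }

    location / {
        proxy_pass http://wsgi_app;
        proxy_set_header Host $host;
    }
}
```

Each of the two servers can then run as its own systemd unit, which matches the systemd-as-process-manager approach he mentions.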
+ +00:15:33.600 --> 00:15:34.640 +It really is amazing there. + +00:15:34.800 --> 00:15:36.860 +The Pallets website is running on Flask, + +00:15:37.160 --> 00:15:38.080 +which I wasn't doing for a while. + +00:15:38.080 --> 00:15:39.400 +I was doing a static site generator. + +00:15:39.770 --> 00:15:43.340 +Then I got inspired by Andrew Godwin's static dynamic sites. + +00:15:43.860 --> 00:15:46.520 +And so it loads up all these markdown files, + +00:15:46.730 --> 00:15:49.900 +static markdown files into a SQLite database at runtime + +00:15:50.480 --> 00:15:51.720 +and then serves off of that + +00:15:51.750 --> 00:15:52.920 +because you can query really fast. + +00:15:53.100 --> 00:15:54.160 +Oh, that's awesome. I love it. + +00:15:54.260 --> 00:15:56.140 +So I am using SQLite for the Pallets website. + +00:15:56.480 --> 00:16:00.220 +Yeah, I also do have a few small apps that use SQLite. + +00:16:00.340 --> 00:16:02.980 +And one recently that's Cody's fault + +00:16:03.000 --> 00:16:04.580 +because he put me on that track + +00:16:05.240 --> 00:16:08.740 +where it's running a SQLite database in the browser + +00:16:08.960 --> 00:16:12.300 +because nowadays it's quite easy to do that. + +00:16:12.860 --> 00:16:15.200 +And then you can do all sorts of stuff with it, + +00:16:15.300 --> 00:16:19.080 +like hook into it with DuckDB and perform some analysis. + +00:16:19.320 --> 00:16:23.020 +So you don't actually need to run any sort of server at all. + +00:16:23.060 --> 00:16:26.780 +You can just throw some files into Nginx and serve your data. + +00:16:26.980 --> 00:16:28.240 +And as long as that's static, + +00:16:28.300 --> 00:16:30.160 +you have a super, super simple deployment. + +00:16:30.680 --> 00:16:32.200 +So yeah, definitely SQLite. + +00:16:32.580 --> 00:16:34.280 +If you can, I like it. + +00:16:34.360 --> 00:16:35.000 +I agree. + +00:16:35.100 --> 00:16:35.460 +It's interesting.
+ +00:16:36.080 --> 00:16:38.100 +The database probably won't go down with that, probably. + +00:16:38.500 --> 00:16:40.020 +Let's do this by framework. + +00:16:40.180 --> 00:16:42.340 +So we'll do vertical slices in the visual here. + +00:16:42.600 --> 00:16:43.140 +So Jeff. + +00:16:43.560 --> 00:16:46.000 +Yeah, Django, Postgres, pretty old school stack. + +00:16:46.320 --> 00:16:49.120 +I think putting a CDN in front of anything is just a win. + +00:16:49.210 --> 00:16:51.000 +So whether you like Fastly or Cloudflare, + +00:16:51.150 --> 00:16:52.380 +you get a lot of mileage out of it. + +00:16:52.660 --> 00:16:53.760 +You learn a lot about caching + +00:16:53.920 --> 00:16:55.880 +because it's kind of hard to cache Django by default. + +00:16:56.200 --> 00:16:57.200 +So you get to play with curl + +00:16:57.430 --> 00:16:59.260 +and kind of figure out why Vary headers are there. + +00:16:59.670 --> 00:17:02.180 +And it's a good learning experience to get through that. + +00:17:02.240 --> 00:17:06.600 +I also like Coolify, which is kind of new, at least new to me and new to Michael. + +00:17:06.689 --> 00:17:08.240 +We talk about this in our spare time a lot. + +00:17:08.520 --> 00:17:11.800 +It's just kind of a boring service that'll launch a bunch of containers for you. + +00:17:12.280 --> 00:17:15.040 +There's a bunch of one-click installs, so Postgres is a one-click. + +00:17:15.270 --> 00:17:18.240 +It also does backups for you, which is really nice to have for free. + +00:17:18.520 --> 00:17:21.640 +I run a couple dozen sites through it and really like it. + +00:17:21.900 --> 00:17:26.140 +You can either do the hosted version, I don't get any money from it, or you can run the open-source version. + +00:17:26.560 --> 00:17:27.140 +I do both. + +00:17:27.310 --> 00:17:30.500 +I've got like a home lab that I just run stuff using the open-source version. + +00:17:30.520 --> 00:17:32.860 +And for five bucks a month, it's worth it to run a couple servers.
+ +00:17:33.300 --> 00:17:35.400 +And like Carlton said, you can just scale up. + +00:17:35.880 --> 00:17:40.080 +Yeah, it's got a bunch of one-click deploy for self-hosted SaaS things as well. + +00:17:40.260 --> 00:17:44.860 +Like I want an analytics stack of containers that run in its own isolated bit. + +00:17:44.880 --> 00:17:45.960 +Just click here and go. + +00:17:46.280 --> 00:17:48.520 +Yeah, one-click, it's installed and you're up. + +00:17:48.740 --> 00:17:52.760 +And then once you get one Django, Flask, FastAPI site working with it, + +00:17:53.020 --> 00:17:54.440 +and it uses like a Docker container. + +00:17:54.780 --> 00:17:58.500 +Once you get that set up, it's really easy to just kind of duplicate that site, + +00:17:58.960 --> 00:18:01.160 +plug it in to GitHub or whatever your Git provider is. + +00:18:01.260 --> 00:18:05.460 +And it's a nice experience for what normally is just rsyncing files + +00:18:05.820 --> 00:18:06.860 +and life's too short for that. + +00:18:06.960 --> 00:18:09.500 +Sebastian, I want to have you go last on this one + +00:18:09.500 --> 00:18:12.160 +because I think you've got something pretty interesting + +00:18:12.280 --> 00:18:14.260 +with FastAPI Cloud to dive into. + +00:18:14.480 --> 00:18:16.520 +But let's do Litestar next. Cody. + +00:18:16.840 --> 00:18:19.580 +I have actually bought all the way in on Granian. + +00:18:19.840 --> 00:18:22.420 +So for the ASGI server, I've actually been running Granian now + +00:18:22.560 --> 00:18:24.060 +for I'd say a year in production. + +00:18:24.420 --> 00:18:25.660 +It's worked pretty well. + +00:18:26.180 --> 00:18:28.920 +There's a couple of new things that I'm actually kind of + +00:18:28.940 --> 00:18:31.460 +playing with. I don't know how well they're going to work out. So I'm going to go ahead and throw this out + +00:18:31.540 --> 00:18:37.900 +there. But Granian is one of the few ASGI servers that supports HTTP/2.
And it actually can do HTTP/2 + +00:18:38.040 --> 00:18:42.100 +cleartext. And so this is part of the next thing I'm going to say. Because I work for Google, I'm + +00:18:42.480 --> 00:18:47.440 +actively using lots of Kubernetes and Cloud Run mainly. And so most of the things that I deploy + +00:18:47.520 --> 00:18:52.860 +are containerized on Cloud Run. And I typically would suggest if you're not using something like + +00:18:53.100 --> 00:18:57.140 +systemd and deploying it directly on bare metal, then you are going to want to let the container + +00:18:57.160 --> 00:18:59.740 +or whatever you're using to manage your processes, + +00:19:00.340 --> 00:19:01.640 +manage that and spin that up. + +00:19:01.670 --> 00:19:03.900 +And so I typically try to allocate, + +00:19:04.070 --> 00:19:05.780 +you know, like one CPU for the container + +00:19:05.920 --> 00:19:08.660 +and let the actual framework scale it up and down as needed. + +00:19:09.080 --> 00:19:11.600 +Cloud Run itself has a, like an ingress, + +00:19:11.830 --> 00:19:13.320 +like a load balancer that sits in front + +00:19:13.480 --> 00:19:14.480 +that it automatically configures. + +00:19:14.510 --> 00:19:16.980 +And you're required to basically serve up + +00:19:17.160 --> 00:19:19.300 +cleartext traffic when you run on Cloud Run. + +00:19:19.740 --> 00:19:22.580 +And because now Granian supports HTTP/2 + +00:19:22.820 --> 00:19:25.340 +and Cloud Run supports HTTP/2 cleartext, + +00:19:25.500 --> 00:19:28.140 +you can now serve Granian as HTTP/2 traffic. + +00:19:28.520 --> 00:19:30.960 +The good thing about that is that you get an unlimited upload size. + +00:19:31.060 --> 00:19:33.380 +And so there are max thresholds to what you can upload + +00:19:33.660 --> 00:19:34.720 +into the various cloud environments. + +00:19:35.500 --> 00:19:37.820 +HTTP/2 usually circumvents that or gets around it + +00:19:37.900 --> 00:19:39.120 +because of the way the protocol works.
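A hedged sketch of the serving side of the setup Cody describes: Granian speaking HTTP/2 cleartext (h2c) behind Cloud Run's TLS-terminating ingress. The module path is hypothetical, and the flag names should be verified against `granian --help` for your installed version:

```shell
# Run an ASGI app over HTTP/2 cleartext; Cloud Run terminates TLS at its
# ingress and forwards plaintext HTTP/2 to the container on $PORT.
granian --interface asgi \
        --http 2 \
        --host 0.0.0.0 \
        --port 8080 \
        app.main:app
```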
+ +00:19:39.200 --> 00:19:42.020 +And so you get additional features and functionality because of that. + +00:19:42.200 --> 00:19:43.780 +So anyway, that's what I typically do. + +00:19:44.000 --> 00:19:46.280 +And most of my databases are usually Postgres, + +00:19:47.020 --> 00:19:50.180 +AlloyDB if it needs to be something that's on the analytical side. + +00:19:50.600 --> 00:19:52.160 +Yeah, I'm on Team Granian as well. + +00:19:52.160 --> 00:19:53.600 +I think that's a super neat framework. + +00:19:53.840 --> 00:19:56.660 +I had Giovanni on who's behind it a while ago. + +00:19:57.050 --> 00:19:59.820 +It seems like it's not as popular, + +00:20:00.220 --> 00:20:02.000 +but it's based on Hyper from the Rust world, + +00:20:02.220 --> 00:20:05.500 +which has like 130,000 projects based on it or something. + +00:20:05.530 --> 00:20:09.420 +So, you know, at its core, it's still pretty battle-tested. + +00:20:11.420 --> 00:20:16.140 +This portion of Talk Python To Me is brought to you by our course, Just Enough Python for Data + +00:20:16.380 --> 00:20:21.160 +Scientists. If you live in notebooks but need your work to hold up in the real world, check out Just + +00:20:21.220 --> 00:20:26.580 +Enough Python for Data Scientists. It's a focused, code-first course that tightens the Python you + +00:20:26.840 --> 00:20:33.080 +actually use and adds the habits that make results repeatable. We refactor messy cells into functions + +00:20:33.100 --> 00:20:38.700 +and packages, use Git on easy mode, lock environments with uv, and even ship with Docker. + +00:20:39.380 --> 00:20:42.020 +Keep your notebook speed, add engineering reliability. + +00:20:42.720 --> 00:20:43.900 +Find it at Talk Python Training. + +00:20:44.140 --> 00:20:46.900 +Just click courses in the navbar at talkpython.fm. + +00:20:47.959 --> 00:20:49.260 +Yannick, how about you? + +00:20:49.560 --> 00:20:50.800 +You've got a variety, it sounds like. + +00:20:50.860 --> 00:20:51.680 +Yeah, definitely.
+ +00:20:53.000 --> 00:20:59.340 +There's a pretty clear split between what I do at work and what I do outside of that. + +00:20:59.440 --> 00:21:01.720 +So at work, it's Kubernetes deployments. + +00:21:01.900 --> 00:21:05.360 +And we manage that pretty much the same way that Cody described. + +00:21:05.540 --> 00:21:09.700 +So it's one or two processes per pod max. + +00:21:10.060 --> 00:21:14.660 +So you can have Kubernetes scale it or even manually easily scale that up. + +00:21:14.760 --> 00:21:19.540 +You can just go into Kubernetes and say, OK, do me one to five more pods or whatever. + +00:21:20.060 --> 00:21:20.680 +And don't have to worry. + +00:21:21.040 --> 00:21:22.620 +Don't have to start calculating whatever. + +00:21:23.280 --> 00:21:28.580 +Most of the stuff we run nowadays is with Uvicorn. Our Django deployment, + +00:21:28.900 --> 00:21:34.080 +up until I think three months ago or so, was running under Gunicorn, + +00:21:34.160 --> 00:21:35.240 +but we switched that actually. + +00:21:35.760 --> 00:21:37.800 +And it's been a really great experience. + +00:21:38.380 --> 00:21:42.880 +I think we tried that a year ago and it didn't work out quite so well. + +00:21:42.900 --> 00:21:47.440 +There were some things that didn't work as expected or didn't perform great + +00:21:47.700 --> 00:21:52.220 +or Django was throwing some errors or Uvicorn was throwing some errors. + +00:21:52.480 --> 00:21:55.040 +And then apparently all of that got fixed + +00:21:55.360 --> 00:21:58.380 +because now it runs without any issue in production. + +00:21:58.560 --> 00:22:06.400 +Yeah, for people who don't know, the vibe used to be run Gunicorn, but then with Uvicorn workers, if you're doing async stuff. + +00:22:06.510 --> 00:22:12.620 +And then Uvicorn kind of stepped up its game and said, you can actually treat us as our own app server. + +00:22:12.940 --> 00:22:14.580 +We'll manage lifecycle and stuff.
+ +00:22:14.590 --> 00:22:16.120 +And so that's the path you took, right? + +00:22:16.240 --> 00:22:16.740 +Yeah, exactly. + +00:22:16.970 --> 00:22:17.340 +Before that. + +00:22:17.630 --> 00:22:21.980 +Well, no, actually, before that, we didn't because our Django is fully synchronous. + +00:22:22.170 --> 00:22:24.020 +It doesn't do any async. + +00:22:24.220 --> 00:22:26.820 +So it was just bare-metal Gunicorn. + +00:22:26.960 --> 00:22:30.100 +And it's still synchronous with just running it under Uvicorn. + +00:22:30.540 --> 00:22:34.200 +But interestingly, still quite a bit faster in a few cases. + +00:22:34.800 --> 00:22:38.520 +We tried that out and we load tested it in a couple of scenarios + +00:22:38.800 --> 00:22:41.100 +and we found that it makes a lot of sense. + +00:22:41.560 --> 00:22:44.200 +But outside of that, I do have a lot of, well, + +00:22:44.780 --> 00:22:48.100 +very simplistic deployments that are also just systemd + +00:22:48.380 --> 00:22:53.339 +and a couple of Docker compose files and containers + +00:22:53.360 --> 00:22:58.620 +that are managed through some old cobbled-together Ansible things. + +00:22:59.160 --> 00:23:02.620 +But I think the oldest one that I have still running is from 2017. + +00:23:03.500 --> 00:23:07.040 +And it's been running without a change for like four or five years. + +00:23:07.220 --> 00:23:07.920 +That is awesome. + +00:23:08.180 --> 00:23:11.680 +I don't see a reason to do anything about it because the app works. + +00:23:11.920 --> 00:23:13.500 +It's being used productively. + +00:23:14.100 --> 00:23:16.060 +So why change anything about that? + +00:23:16.060 --> 00:23:17.000 +No need to introduce. + +00:23:17.380 --> 00:23:18.180 +Just don't touch it. + +00:23:18.560 --> 00:23:22.340 +Yeah, I was actually looking into Coolify that you two guys mentioned. + +00:23:22.760 --> 00:23:28.040 +I was thinking about, you know, maybe upgrading it to that, but I played around with it and I thought, well, why?
+
+00:23:28.220 --> 00:23:31.520
+You know, if I have to look into that deployment maybe once a year.
+
+00:23:31.960 --> 00:23:36.080
+So there's really nothing to gain for me in making it more complicated.
+
+00:23:36.560 --> 00:23:37.960
+David, Team Flask.
+
+00:23:38.320 --> 00:23:45.000
+I mentioned this before the show started, and I'm pretty sure I said this the last time I was on Talk Python,
+
+00:23:45.400 --> 00:23:50.760
+but the projects I do for work typically have less than 100 users.
+
+00:23:51.700 --> 00:23:54.500
+And so my deployment is usually really simple.
+
+00:23:54.680 --> 00:23:57.780
+And usually they've chosen like Azure or AWS already.
+
+00:23:58.180 --> 00:24:00.020
+So we just have a Docker container
+
+00:24:00.360 --> 00:24:03.020
+and we put it on the relevant Docker container host
+
+00:24:03.080 --> 00:24:05.480
+in that service and it just works for them.
+
+00:24:05.740 --> 00:24:08.400
+We have a Postgres database and we have like Redis.
+
+00:24:08.880 --> 00:24:11.280
+But I never really had to deal with like scaling
+
+00:24:11.820 --> 00:24:13.380
+or that sort of stuff.
+
+00:24:13.740 --> 00:24:16.780
+But the funny thing is like, at least for my work,
+
+00:24:16.940 --> 00:24:19.360
+we're often replacing older systems.
+
+00:24:19.840 --> 00:24:32.620
+And so even a single Docker container running a Flask application is way more performant and responsive than anything they're used to from like some 20 year old or 30 year old Java system.
+
+00:24:32.880 --> 00:24:38.420
+Right. And it can just respond on a small container with like a little bit of CPU and a little bit of memory.
+
+00:24:38.740 --> 00:24:41.380
+They're always shocked at like, how much do we need to pay for this?
+
+00:24:41.520 --> 00:24:43.280
+Oh, it'll just run on a potato.
+
+00:24:44.160 --> 00:24:46.680
+You know, there's only 100 users and they're like, that's a lot of users. 
+
+00:24:48.400 --> 00:24:52.740
+So my recommendation is always start small and then scale up from there.
+
+00:24:52.900 --> 00:24:54.980
+Don't try to overthink it ahead of time.
+
+00:24:55.440 --> 00:24:58.860
+Yeah, for my personal stuff, I'm using like Docker containers now and fly.io.
+
+00:24:59.070 --> 00:25:00.160
+I haven't gotten in.
+
+00:25:00.550 --> 00:25:04.420
+So I do want to look into Granian and Coolify, but I haven't gotten there yet.
+
+00:25:04.920 --> 00:25:09.160
+And for the Docker container, I can definitely recommend pythonspeed.com.
+
+00:25:09.480 --> 00:25:13.260
+I don't remember off the top of my head who writes that, but it's somebody in the Python
+
+00:25:13.860 --> 00:25:14.240
+ecosystem.
+
+00:25:14.720 --> 00:25:18.480
+And they have a whole series of articles on how to optimize your Docker container.
+
+00:25:18.960 --> 00:25:22.440
+And that sounds really complicated, but you end up with a Dockerfile that's like 20 lines
+
+00:25:22.530 --> 00:25:23.260
+long or something.
+
+00:25:23.450 --> 00:25:26.280
+So it's not like there's crazy things.
+
+00:25:26.290 --> 00:25:28.140
+It's just you have to know how to structure it.
+
+00:25:28.330 --> 00:25:30.100
+And then I just copy and paste that to the next project.
+
+00:25:30.380 --> 00:25:30.560
+Nice.
+
+00:25:30.820 --> 00:25:30.900
+Yeah.
+
+00:25:31.300 --> 00:25:34.360
+I resisted doing Docker for a long time because I'm like, I don't want that extra complexity.
+
+00:25:34.550 --> 00:25:38.420
+But then I realized the stuff you put in the Dockerfile is really what you just type in
+
+00:25:38.420 --> 00:25:40.260
+the terminal once and then you forget.
+
+00:25:41.140 --> 00:25:44.580
+I mean, always using Postgres, Redis, probably, if I need some background
+
+00:25:44.680 --> 00:25:51.120
+tasks, just a plain SMTP server for email. For all three of those things, 
I wrote new
+
+00:25:51.440 --> 00:25:55.920
+extensions in the Flask ecosystem that I'm trying to get more people to know about now. So Flask
+
+00:25:56.060 --> 00:26:01.920
+SQLAlchemy Lite, L-I-T-E, instead of Flask SQLAlchemy, takes a much more lightweight approach to
+
+00:26:02.360 --> 00:26:08.100
+integrating SQLAlchemy with Flask. And then Flask Redis, I revived from like 10 years of
+
+00:26:08.420 --> 00:26:12.699
+non-maintenance. And then I wrote this whole system, this whole pluggable email system called
+
+00:26:12.720 --> 00:26:18.560
+Email Simplified, kind of inspired by Django's pluggable system. And so there's
+
+00:26:18.560 --> 00:26:23.560
+like Flask Email Simplified to integrate that with Flask. But unlike Django, you can use Email
+
+00:26:23.780 --> 00:26:27.580
+Simplified in any library you're writing, in any Python application you're writing. It doesn't have
+
+00:26:27.580 --> 00:26:32.780
+to be a Flask web framework. It's pluggable as the library itself. And then you can also integrate
+
+00:26:32.830 --> 00:26:38.640
+it with Flask or something else. So Flask Email Simplified. I get like three downloads a month
+
+00:26:38.640 --> 00:26:43.240
+right now. So it needs some popularity. Awesome. I've been doing the non-simplified email lately.
+
+00:26:43.440 --> 00:26:48.620
+So I'm happy to hear that there might be a better way. Yeah. I think people do underappreciate just
+
+00:26:48.720 --> 00:26:53.780
+how much performance you can get out of Python web apps. You know, they're like, oh, we're going to
+
+00:26:53.800 --> 00:26:59.340
+need to rewrite this in something else because of the GIL or whatever. Like I decided just to make
+
+00:26:59.340 --> 00:27:04.499
+a point to pull up the tail of my log running Quart, by the way. 
And each one of these requests
+
+00:27:04.520 --> 00:27:10.120
+doing like multiple DB calls, and it's like 23 milliseconds, 6 milliseconds, 3 milliseconds,
+
+00:27:10.600 --> 00:27:15.840
+you know, 9 milliseconds. It's like, that's good enough. That's a lot of requests per second
+
+00:27:16.320 --> 00:27:21.800
+per worker until you've got a lot of traffic. Speaking of Quart, Phil, what's your take
+
+00:27:21.940 --> 00:27:27.860
+on this one? I think it's very similar. I also build Docker containers with a Postgres database
+
+00:27:27.980 --> 00:27:34.480
+on the back end, and I run Hypercorn as the ASGI server and put them behind an AWS load balancer
+
+00:27:34.560 --> 00:27:35.840
+and just run them in ECS.
+
+00:27:36.080 --> 00:27:37.840
+And I think it's pretty simple,
+
+00:27:38.120 --> 00:27:39.680
+but I guess it depends on your biases.
+
+00:27:39.960 --> 00:27:41.840
+But yeah, that's all we do really.
+
+00:27:41.880 --> 00:27:42.760
+And it goes a long way.
+
+00:27:42.940 --> 00:27:44.800
+There are multiple ECS tasks,
+
+00:27:45.240 --> 00:27:47.480
+mostly in case one falls over, rather than for scaling;
+
+00:27:47.760 --> 00:27:50.240
+it's usually the database that you need to scale, I find.
+
+00:27:50.520 --> 00:27:51.940
+But yeah, that's how we run it.
+
+00:27:52.180 --> 00:27:53.940
+The nice thing for me about Hypercorn
+
+00:27:54.280 --> 00:27:55.700
+is that I can play with HTTP/3.
+
+00:27:56.040 --> 00:27:57.320
+So that's what we're doing at times.
+
+00:27:57.640 --> 00:27:58.780
+Oh, HTTP/3, okay.
+
+00:27:59.380 --> 00:28:01.240
+I've just been getting my HTTP/2 game down,
+
+00:28:01.780 --> 00:28:02.980
+so I'm already behind the game.
+
+00:28:03.860 --> 00:28:04.820
+What's the deal with HTTP/3?
+
+00:28:05.080 --> 00:28:08.400
+It's obviously a totally new way of doing it over UDP now
+
+00:28:08.720 --> 00:28:09.520
+rather than TCP. 
+
+00:28:10.340 --> 00:28:11.720
+Although at the application level,
+
+00:28:11.960 --> 00:28:13.180
+you can't tell any difference really.
+
+00:28:13.460 --> 00:28:15.480
+But I mean, I just find it interesting.
+
+00:28:15.660 --> 00:28:17.580
+I'm not really sure it will help too much.
+
+00:28:17.920 --> 00:28:21.140
+And it's probably best if you've got users who have not that
+
+00:28:21.340 --> 00:28:22.100
+great a network connection.
+
+00:28:22.600 --> 00:28:25.840
+But for most other cases, I don't think it matters too much.
+
+00:28:25.880 --> 00:28:29.140
+Just keep blasting packets until some of them get through.
+
+00:28:29.940 --> 00:28:30.320
+OK, fine.
+
+00:28:30.320 --> 00:28:31.200
+We'll give you a page eventually.
+
+00:28:31.380 --> 00:28:32.260
+There's three pages, actually.
+
+00:28:32.620 --> 00:28:43.300
+All right, Sebastian, you are running not just FastAPI from your experience, but you're running FastAPI for a ton of people through FastAPI Cloud at, I'm sure, many different levels.
+
+00:28:43.580 --> 00:28:47.660
+This probably sounds like a shameless plug, and it kind of is, but it's sort of expected.
+
+00:28:48.120 --> 00:28:50.980
+I will deploy FastAPI on FastAPI Cloud.
+
+00:28:51.520 --> 00:28:54.760
+Just because, well, the idea is just to make it super simple to do that.
+
+00:28:54.960 --> 00:28:58.440
+You know, like if you are able to run the command FastAPI run.
+
+00:28:58.820 --> 00:29:01.540
+So FastAPI run has like the production server
+
+00:29:01.600 --> 00:29:03.020
+that is using Uvicorn underneath.
+
+00:29:03.440 --> 00:29:04.680
+And if you can run that,
+
+00:29:04.800 --> 00:29:06.420
+then you can also run FastAPI deploy.
+
+00:29:06.740 --> 00:29:10.160
+And then like, you know, like it will most probably just work. 
+
+00:29:10.600 --> 00:29:13.260
+And, you know, we just wrap everything and like deploy,
+
+00:29:13.740 --> 00:29:16.580
+build, install, deploy, handle HTTPS, all the stuff
+
+00:29:17.020 --> 00:29:19.100
+without needing any Dockerfile or anything like that.
+
+00:29:19.320 --> 00:29:20.720
+And I think for many use cases,
+
+00:29:20.900 --> 00:29:22.880
+it's just simpler to be able to do that.
+
+00:29:23.100 --> 00:29:25.280
+There are so many projects that I have been now building,
+
+00:29:25.520 --> 00:29:29.660
+like random stuff that is not really important, but now I can.
+
+00:29:30.220 --> 00:29:34.580
+And before it was like, yeah, well, I know how to deploy this thing like fully with like
+
+00:29:34.700 --> 00:29:37.380
+all the bells and whistles, but it's just so much work that, yeah, maybe later.
+
+00:29:38.000 --> 00:29:40.700
+So for that, I would end up just like going with that.
+
+00:29:41.160 --> 00:29:42.140
+Now if I didn't...
+
+00:29:42.140 --> 00:29:46.580
+Well, what I was going to ask is how much are you willing to tell us about how things run inside
+
+00:29:47.140 --> 00:29:47.940
+FastAPI Cloud?
+
+00:29:48.280 --> 00:29:52.180
+Oh, I can't. It's just so much stuff that is going on.
+
+00:29:52.200 --> 00:29:56.780
+And it's also, it's funny that nowadays, they're like,
+
+00:29:57.340 --> 00:30:00.380
+we have Docker and we have Docker Swarm and there was Nomad
+
+00:30:00.540 --> 00:30:02.720
+and Kubernetes and oh, Kubernetes won.
+
+00:30:02.810 --> 00:30:05.940
+And then we have the cloud providers and there's AWS
+
+00:30:06.460 --> 00:30:07.840
+and Google and Azure.
+
+00:30:08.290 --> 00:30:10.800
+And you would expect that with all these things
+
+00:30:11.120 --> 00:30:14.360
+and all this complexity, now it's like, okay,
+
+00:30:14.540 --> 00:30:15.420
+these are the clear winners. 
+
+00:30:15.640 --> 00:30:17.400
+So it's like a lot of complexity to take on,
+
+00:30:17.470 --> 00:30:20.980
+but once you do, it all works. But it doesn't.
+
+00:30:21.200 --> 00:30:27.360
+And it's just like so much work to get things to work together, to work correctly.
+
+00:30:27.920 --> 00:30:32.380
+And the official resources from the different providers and things,
+
+00:30:32.750 --> 00:30:36.540
+in many cases, it's like, oh, the solution is hidden in this issue somewhere in GitHub
+
+00:30:36.700 --> 00:30:38.240
+because the previous version was obsolete,
+
+00:30:38.460 --> 00:30:43.180
+but now the new version of this package or whatever is like, it's just, it's crazy.
+
+00:30:43.620 --> 00:30:50.400
+But like, yeah, so if I didn't have FastAPI Cloud, I would probably use containers.
+
+00:30:50.660 --> 00:30:51.460
+I would probably use Docker.
+
+00:30:51.940 --> 00:30:52.980
+If it's like something simple,
+
+00:30:53.030 --> 00:30:54.860
+I would deploy with Docker Compose,
+
+00:30:55.000 --> 00:30:57.400
+probably try to set minimum replicas.
+
+00:30:57.920 --> 00:30:59.840
+I don't remember if Docker Compose has that.
+
+00:30:59.940 --> 00:31:01.840
+I remember that Docker Swarm had that,
+
+00:31:01.940 --> 00:31:04.900
+but then Docker Swarm sort of lost against Kubernetes.
+
+00:31:05.420 --> 00:31:08.320
+I would put a Traefik load balancer in front
+
+00:31:08.390 --> 00:31:11.840
+to handle HTTPS and, yeah, well, like regular load balancing.
+
+00:31:12.970 --> 00:31:14.700
+And, yeah, just regular Uvicorn.
+
+00:31:14.860 --> 00:31:17.800
+Like some of the folks were talking about before,
+
+00:31:18.440 --> 00:31:22.200
+at some point, we needed to have Gunicorn on top of Uvicorn
+
+00:31:22.300 --> 00:31:24.580
+because Uvicorn wouldn't be able to handle workers. 
+
+00:31:25.040 --> 00:31:27.240
+But now Uvicorn can handle its workers and everything.
+
+00:31:27.500 --> 00:31:30.400
+The main thing was supervising the processes
+
+00:31:30.600 --> 00:31:33.980
+and reaping the processes and handling that stuff.
+
+00:31:34.300 --> 00:31:35.360
+Now it can just do that.
+
+00:31:35.380 --> 00:31:37.460
+So you can just run plain Uvicorn.
+
+00:31:37.740 --> 00:31:40.100
+So if you're using FastAPI and you say FastAPI run,
+
+00:31:40.560 --> 00:31:41.520
+that already does that.
+
+00:31:41.660 --> 00:31:42.920
+So if you're deploying on your own,
+
+00:31:42.960 --> 00:31:44.580
+you can just use the FastAPI run command.
+
+00:31:44.940 --> 00:31:46.940
+Then, of course, you have to deal with the scaling
+
+00:31:47.160 --> 00:31:48.920
+and HTTPS and load balancing
+
+00:31:48.970 --> 00:31:49.600
+and all the stuff,
+
+00:31:49.700 --> 00:31:51.620
+but the core server,
+
+00:31:51.750 --> 00:31:53.260
+you can just run it directly.
+
+00:31:53.640 --> 00:31:54.980
+If you're going beyond that,
+
+00:31:55.080 --> 00:31:56.040
+then there will probably be
+
+00:31:56.290 --> 00:31:57.820
+some Kubernetes cluster
+
+00:31:58.560 --> 00:31:59.760
+and trying to scale things,
+
+00:32:00.120 --> 00:32:01.760
+figure out the ways to scale things
+
+00:32:02.420 --> 00:32:04.460
+based on the load of the requests,
+
+00:32:05.950 --> 00:32:06.940
+like scaling automatically.
+
+00:32:08.020 --> 00:32:10.580
+Having normally one container per process
+
+00:32:10.900 --> 00:32:12.720
+to be able to scale that more dynamically
+
+00:32:13.000 --> 00:32:14.260
+without depending on the local memory
+
+00:32:14.400 --> 00:32:15.200
+for each one of the servers
+
+00:32:15.270 --> 00:32:15.960
+and things like that,
+
+00:32:16.260 --> 00:32:17.300
+I'm probably saying too much. 
+
+00:32:17.640 --> 00:32:20.840
+But yeah, actually, you know, like if I didn't have FastAPI Cloud,
+
+00:32:20.980 --> 00:32:26.940
+I would probably use one of the providers that abstract those things a little bit away,
+
+00:32:27.060 --> 00:32:31.100
+you know, like Render, Railway, Fly, like, I don't know.
+
+00:32:31.260 --> 00:32:36.280
+Like, I don't really think that a regular developer should be dealing with,
+
+00:32:36.280 --> 00:32:39.140
+you know, like the big hyperscalers and like Kubernetes
+
+00:32:39.360 --> 00:32:42.560
+and like all that complexity for a common app.
+
+00:32:42.760 --> 00:32:47.360
+In most of the cases, I think it's just really too much complexity to deal with.
+
+00:32:47.440 --> 00:32:51.940
+It's kind of eye-watering to open up the AWS console or Azure or something.
+
+00:32:52.310 --> 00:32:52.440
+Whoa.
+
+00:32:52.980 --> 00:32:59.000
+Oh, the other day, you know, like the other day I had to, in one of the AWS accounts, I had to change the account email.
+
+00:32:59.500 --> 00:33:00.840
+I think I spent four hours.
+
+00:33:01.080 --> 00:33:01.420
+I know.
+
+00:33:01.640 --> 00:33:05.820
+Because I had to create the delegate account that has the right permissions and role.
+
+00:33:05.960 --> 00:33:08.140
+And they're like, oh, no, this is, you know, like,
+
+00:33:08.920 --> 00:33:11.520
+sometimes it's just overwhelming the amount of complexity
+
+00:33:11.820 --> 00:33:12.920
+that needs to be dealt with.
+
+00:33:13.660 --> 00:33:16.280
+And, yeah, I mean, it's great to really have, like,
+
+00:33:16.440 --> 00:33:20.300
+you know, like the infra people that I have working with me
+
+00:33:20.300 --> 00:33:22.780
+at the company that can deal with all that mess
+
+00:33:22.920 --> 00:33:26.020
+and, like, can make sure that everything is just running perfectly
+
+00:33:26.220 --> 00:33:27.280
+and it just works. 
+
+00:33:27.440 --> 00:33:30.100
+So it's like, you know, like, sort of SRE as a service,
+
+00:33:30.520 --> 00:33:32.440
+DevOps as a service for everyone.
+
+00:33:32.860 --> 00:33:35.600
+It's like a cloud product that provides DevOps as a service.
+
+00:33:36.620 --> 00:33:41.700
+I spent a number of years doing nothing but cloud migrations to these hyperscalers for enterprises.
+
+00:33:42.420 --> 00:33:47.460
+And I can tell you that when you mentioned the eye-watering comment about the network and all that stuff,
+
+00:33:48.260 --> 00:33:49.760
+it's so incredibly complicated now, right?
+
+00:33:49.940 --> 00:33:55.000
+There's literally every kind of concept that you need to know to deploy these enterprises now,
+
+00:33:55.180 --> 00:33:56.700
+to move them from on-prem to the cloud.
+
+00:33:56.900 --> 00:33:58.640
+So it does get incredibly complicated.
+
+00:33:58.770 --> 00:34:02.680
+Having something simple like what Sebastian is talking about, I think, is super helpful
+
+00:34:02.700 --> 00:34:04.380
+when you're just trying to get started
+
+00:34:04.380 --> 00:34:06.060
+and get something up and running quickly.
+
+00:34:06.240 --> 00:34:07.140
+I've got a lot of questions,
+
+00:34:07.340 --> 00:34:09.899
+and I realize that we will not be getting through all of them.
+
+00:34:10.040 --> 00:34:12.440
+So I want to pick carefully.
+
+00:34:12.720 --> 00:34:14.840
+So let's do this one next.
+
+00:34:15.220 --> 00:34:18.820
+Performance: what's your best low-effort tip?
+
+00:34:18.960 --> 00:34:20.899
+Not like something super complicated,
+
+00:34:21.360 --> 00:34:23.679
+but I know there's a bunch of low-hanging fruit
+
+00:34:23.840 --> 00:34:25.540
+that people maybe miss out on.
+
+00:34:26.020 --> 00:34:27.760
+And this time let's start with Litestar.
+
+00:34:28.159 --> 00:34:29.240
+Cody, back at you. 
+
+00:34:29.480 --> 00:34:30.760
+I'm going to stick to what I know,
+
+00:34:30.820 --> 00:34:32.520
+which is databases, because I deal with that
+
+00:34:32.580 --> 00:34:38.060
+every single day. There's a couple of things that I see as like gotchas that I constantly see over
+
+00:34:38.060 --> 00:34:45.240
+and over. One, SQLAlchemy kind of obfuscates the way it's going to execute things and what kind of
+
+00:34:45.379 --> 00:34:49.899
+queries it's going to actually execute. So it's really easy, if you're not kind of fluent in how
+
+00:34:49.909 --> 00:34:55.360
+it works, to create N plus one types of issues. And so when people start talking about sync or async,
+
+00:34:55.500 --> 00:34:59.280
+it's really, in my mind, it's less of that, because you're going to spend more time waiting on the
+
+00:34:59.300 --> 00:35:04.200
+network and database and those kinds of things than you're going to spend serializing just
+
+00:35:04.500 --> 00:35:09.920
+generally, right? And/or processing things in the web framework. And so, one, making sure that you have
+
+00:35:10.060 --> 00:35:15.340
+your relationships dialed in correctly so that you don't have N plus one queries. The other thing is
+
+00:35:15.560 --> 00:35:21.260
+oversized connection pooling into Postgres and just databases in general, because what people don't
+
+00:35:21.380 --> 00:35:26.240
+tend to know is that each of those connections takes up CPU cycles and RAM on the database.
+
+00:35:26.520 --> 00:35:33.360
+And so when you slam the database with hundreds of connections, you're just taking away processing power that can be used for other things, right?
+
+00:35:33.490 --> 00:35:35.740
+And so you end up ultimately slowing things down.
+
+00:35:35.830 --> 00:35:44.100
+So I've seen databases that have had so many connections that all of the CPU is actually just managing connections and can't actually do any database work. 
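The N+1 pattern Cody describes is easy to see with a toy query counter. This is a sketch with made-up names rather than SQLAlchemy itself; in SQLAlchemy, the batched behavior is what eager-loading options like `selectinload` on your relationships give you:

```python
# Sketch of the N+1 problem using a fake query counter instead of a real
# database. All class and method names here are illustrative.

class FakeDB:
    """Counts queries so the N+1 pattern is visible."""
    def __init__(self):
        self.queries = 0
        self.users = [1, 2, 3, 4, 5]
        self.posts = {u: [f"post-{u}-{i}" for i in range(3)] for u in self.users}

    def fetch_users(self):
        self.queries += 1
        return list(self.users)

    def fetch_posts_for(self, user_id):
        self.queries += 1          # one query per user: this is the "+N"
        return self.posts[user_id]

    def fetch_posts_for_many(self, user_ids):
        self.queries += 1          # one batched query, like an eager load
        return {u: self.posts[u] for u in user_ids}

# Lazy-loading style: 1 query for users + N more for their posts.
db = FakeDB()
for user in db.fetch_users():
    db.fetch_posts_for(user)
assert db.queries == 6   # 1 + 5

# Eager-loading style: 2 queries total, no matter how many users there are.
db = FakeDB()
db.fetch_posts_for_many(db.fetch_users())
assert db.queries == 2
```

The difference grows linearly with row count, which is why this shows up as mysterious slowness only once a table gets big.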
+
+00:35:44.270 --> 00:35:45.460
+And so what about this socket?
+
+00:35:45.620 --> 00:35:46.180
+Is it busy?
+
+00:35:46.260 --> 00:35:46.800
+What about this socket?
+
+00:35:46.880 --> 00:35:47.260
+Is it busy?
+
+00:35:47.340 --> 00:35:48.580
+It's just round-robining that, right?
+
+00:35:48.940 --> 00:35:51.980
+Paying attention to the database is kind of my first rule of thumb.
+
+00:35:52.120 --> 00:35:52.480
+100%.
+
+00:35:52.610 --> 00:35:53.660
+I like that one a lot.
+
+00:35:53.700 --> 00:36:01.660
+I'll throw in identifying work that doesn't need to be done immediately for the user and putting it in a background task.
+
+00:36:02.090 --> 00:36:04.700
+Having a background worker defer things till later.
+
+00:36:05.190 --> 00:36:10.060
+So sending email is an example, although there's nuances there about knowing that it's sent and everything.
+
+00:36:10.380 --> 00:36:20.460
+But yeah, if your user kicks off some process and then you do that process in the web worker, you're holding that worker up, which is more relevant in WSGI than ASGI,
+
+00:36:20.540 --> 00:36:26.300
+and you're making them wait for their page to load, versus record what they wanted to do,
+
+00:36:26.500 --> 00:36:30.680
+send it off to the background, let them see the status of it, but let the background worker handle
+
+00:36:30.700 --> 00:36:35.260
+it. All right, yeah. Like I said, ASGI guys, go for it. I'm not sure if that's some sort of,
+
+00:36:35.420 --> 00:36:41.380
+it's not really a trick or a tip, it's more like, I think, the most common mistake I see, and it
+
+00:36:41.560 --> 00:36:46.200
+is ASGI specific, but when I look at ASGI apps that people have written, who are maybe not as
+
+00:36:46.220 --> 00:36:52.760
+familiar with ASGI or async Python at all: if you make something an async function, you should be
+
+00:36:52.760 --> 00:36:59.220
+absolutely sure that it's non-blocking. 
Because if you're running an ASGI app and you're blocking
+
+00:36:59.920 --> 00:37:04.660
+anywhere, your whole application server is blocked completely. It doesn't handle any other requests
+
+00:37:04.680 --> 00:37:11.839
+at the same time. It's blocked. I don't think I've seen any mistake more times when I've looked through
+
+00:37:11.840 --> 00:37:18.820
+some apps that someone has written or that I've come across somewhere. So this is really super
+
+00:37:18.980 --> 00:37:26.280
+common, and it has such a big impact on the overall performance, in every metric
+
+00:37:26.880 --> 00:37:33.200
+imaginable. So I would say, and that's nowadays what I tell people: unless you're 100%
+
+00:37:33.440 --> 00:37:39.280
+sure that you know what you're doing and you know it's non-blocking, don't make it async. Put it
+
+00:37:39.120 --> 00:37:41.700
+in a thread pool, execute it in a thread, whatever.
+
+00:37:42.290 --> 00:37:44.580
+All of the ASGI frameworks and Django
+
+00:37:45.130 --> 00:37:48.840
+give you a lot of tools at hand to translate your stuff
+
+00:37:49.520 --> 00:37:52.080
+from sync to async so you can still run it.
+
+00:37:52.240 --> 00:37:56.600
+Do that unless you're very sure that it actually fully supports
+
+00:37:57.080 --> 00:37:57.260
+async.
+
+00:37:57.290 --> 00:37:58.340
+Yeah, that's good advice.
+
+00:37:58.700 --> 00:37:59.120
+Sebastian.
+
+00:37:59.650 --> 00:38:02.120
+Hey, I'm actually going to second Yannick.
+
+00:38:02.800 --> 00:38:04.620
+I think, yeah, like it's,
+
+00:38:04.960 --> 00:38:08.020
+and it's maybe counterintuitive that one
+
+00:38:08.040 --> 00:38:12.880
+of the tips of performance is to try to not optimize that much performance at the beginning. 
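Yannick's warning is easy to demonstrate in a few lines of asyncio; the handler names below are made up for illustration, not from any framework. `time.sleep` inside an `async def` stalls the whole event loop, while `asyncio.to_thread` keeps it free:

```python
import asyncio
import time

async def blocking_handler():
    time.sleep(0.2)   # blocks the event loop: no other coroutine runs meanwhile

async def offloaded_handler():
    # Runs the blocking call on a thread; the loop keeps serving other work.
    await asyncio.to_thread(time.sleep, 0.2)

async def measure(handler):
    """Time five 'requests' handled concurrently."""
    start = time.perf_counter()
    await asyncio.gather(*(handler() for _ in range(5)))
    return time.perf_counter() - start

blocked = asyncio.run(measure(blocking_handler))     # ~1.0s: fully serialized
offloaded = asyncio.run(measure(offloaded_handler))  # ~0.2s: truly concurrent
assert offloaded < blocked
```

The same five "requests" take roughly five times longer when the handler blocks, which is exactly the whole-server stall being described.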
+
+00:38:13.340 --> 00:38:17.740
+You know, like, I think the idea with async is like, oh, you can get so much performance
+
+00:38:17.980 --> 00:38:19.960
+and throughput in terms of concurrency, whatever.
+
+00:38:20.360 --> 00:38:26.720
+But the thing is, in most of the cases, you know, like, until apps grow so large, they
+
+00:38:26.900 --> 00:38:30.420
+actually don't need that much extra throughput, that much extra performance.
+
+00:38:30.920 --> 00:38:35.860
+And in a framework like, you know, like, as Yannick was saying, well, in my case, I know
+
+00:38:35.880 --> 00:38:38.080
+FastAPI, but like, you know, like also many others.
+
+00:38:38.520 --> 00:38:41.320
+If you define the function with async, it's going to be run async.
+
+00:38:41.400 --> 00:38:43.840
+If you define it non-async, a regular def,
+
+00:38:44.140 --> 00:38:46.400
+it's going to be run on a thread worker automatically.
+
+00:38:46.960 --> 00:38:50.440
+So it's just going to do the smart thing automatically.
+
+00:38:50.700 --> 00:38:53.880
+So it's like fair, you know, like it's going to be good enough.
+
+00:38:54.340 --> 00:38:58.300
+And then you can just start with that and just keep blocking code everywhere.
+
+00:38:58.380 --> 00:39:01.340
+You know, like just not use async until you actually know
+
+00:39:01.480 --> 00:39:03.280
+that you really need to use async.
+
+00:39:03.900 --> 00:39:05.720
+And once you do, you have to be, as Yannick was saying,
+
+00:39:05.780 --> 00:39:10.900
+you know, like 100% sure that you are not running blocking code inside of it.
+
+00:39:11.220 --> 00:39:14.420
+And if you need to run blocking code inside of async code,
+
+00:39:14.650 --> 00:39:17.360
+then make sure that you are sending it to a thread worker.
+
+00:39:17.680 --> 00:39:20.420
+Sending it to a thread worker sounds like a whole thing,
+
+00:39:20.620 --> 00:39:22.420
+but yeah, like, you know, like Django has tools,
+
+00:39:23.000 --> 00:39:23.720
+AnyIO has tools. 
+
+00:39:23.960 --> 00:39:27.000
+I also built something on top of AnyIO called Asyncer,
+
+00:39:27.200 --> 00:39:29.100
+that is just to simplify these things,
+
+00:39:29.300 --> 00:39:31.000
+to asyncify a blocking function,
+
+00:39:31.440 --> 00:39:33.160
+keeping all the type information
+
+00:39:33.360 --> 00:39:35.460
+so that you get autocompletion and inline errors and everything,
+
+00:39:35.860 --> 00:39:37.600
+even though it's actually doing all the stuff
+
+00:39:37.760 --> 00:39:40.320
+of sending the thing to the thread worker.
+
+00:39:40.760 --> 00:39:41.980
+So the code is super simple.
+
+00:39:42.070 --> 00:39:43.380
+You keep very simple code,
+
+00:39:43.410 --> 00:39:44.940
+but then underneath it's just doing
+
+00:39:45.340 --> 00:39:46.360
+all the stuff that should be done.
+
+00:39:46.700 --> 00:39:48.000
+But you know, like that's normally
+
+00:39:48.160 --> 00:39:50.720
+when you actually need to hyper-optimize things.
+
+00:39:51.040 --> 00:39:51.800
+In most of the cases,
+
+00:39:51.910 --> 00:39:54.940
+you can just start with just not using async at first.
+
+00:39:55.400 --> 00:39:57.800
+Also, now that you're going to have Python multi-threaded,
+
+00:39:58.120 --> 00:39:59.760
+then suddenly you're going to have
+
+00:40:00.000 --> 00:40:02.800
+just so much more performance out of the blue
+
+00:40:02.910 --> 00:40:04.840
+without even having to do much more.
+
+00:40:05.180 --> 00:40:12.220
+So, yeah, actually that's, you know, like, sorry, I kept speaking so much, but here's a tip for improving performance.
+
+00:40:12.860 --> 00:40:14.220
+Upgrade your Python version.
+
+00:40:14.910 --> 00:40:17.220
+I was just chatting today with Savannah.
+
+00:40:17.760 --> 00:40:28.440
+She was adding the benchmarks to the, you know, the official Python benchmarks that they run for the Faster CPython program. 
+
+00:40:29.140 --> 00:40:35.540
+And the change from Python 3.10 to Python 3.14
+
+00:40:36.060 --> 00:40:39.800
+when running FastAPI is like almost double the performance
+
+00:40:40.020 --> 00:40:40.660
+or something like that.
+
+00:40:40.940 --> 00:40:42.060
+It was like, it was crazy.
+
+00:40:42.220 --> 00:40:44.640
+It was just a crazy improvement in performance.
+
+00:40:44.840 --> 00:40:46.520
+So you can just upgrade your Python version.
+
+00:40:46.660 --> 00:40:50.320
+You're gonna get so much better performance just out of that.
+
+00:40:50.400 --> 00:40:52.000
+- Yeah, that's an awesome piece of advice
+
+00:40:52.140 --> 00:40:53.540
+that I think is often overlooked.
+
+00:40:53.980 --> 00:40:55.480
+And it's not only CPU speed,
+
+00:40:55.680 --> 00:40:57.180
+memory usage also gets a lot lower.
+
+00:40:57.600 --> 00:40:58.420
+Whoever's gonna jump in, go ahead.
+
+00:40:58.580 --> 00:41:03.840
+Last year, I was looking at MarkupSafe, which is an HTML escaping library that we use and
+
+00:41:04.230 --> 00:41:05.700
+has a C extension for speedups.
+
+00:41:05.770 --> 00:41:11.460
+And I almost convinced myself that I could stop maintaining the C extension because just Python
+
+00:41:11.720 --> 00:41:12.780
+itself got way faster.
+
+00:41:13.050 --> 00:41:16.700
+But then it turned out that I could do something to the C extension to make it faster also.
+
+00:41:17.110 --> 00:41:17.780
+So I'm still maintaining it.
+
+00:41:18.100 --> 00:41:23.140
+But just the fact that I almost convinced myself like, oh, I can drop a C extension for just
+
+00:41:23.140 --> 00:41:26.160
+a Python upgrade instead was pretty impressive.
+
+00:41:26.740 --> 00:41:29.320
+They've done a lot, especially with like string handling,
+
+00:41:29.400 --> 00:41:30.840
+and, you know, which you're going to use
+
+00:41:30.920 --> 00:41:32.360
+for templating for web apps.
+
+00:41:32.660 --> 00:41:32.780
+Phil. 
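A low-effort way to check what an interpreter upgrade buys on your own workload is a micro-benchmark with the stdlib timeit module. The string-heavy workload here is illustrative, not from the show; the idea is just to run the same script on each Python version and compare:

```python
import sys
import timeit

# A string-building workload, the kind of thing (like templating) that
# recent CPython releases have sped up. Run this on 3.10 and again on a
# newer interpreter and compare the printed timings.
def render_rows() -> str:
    return "".join(f"<tr><td>{i}</td></tr>" for i in range(1000))

seconds = timeit.timeit(render_rows, number=200)
print(f"Python {sys.version_info.major}.{sys.version_info.minor}: {seconds:.3f}s")
```

Because the workload is identical across runs, any difference in the timing is the interpreter, which makes this a fair before-and-after check for an upgrade.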
+
+00:41:32.920 --> 00:41:36.020
+Yeah, well, I definitely echo looking at your DB queries,
+
+00:41:36.300 --> 00:41:38.260
+because by and large, that's always where
+
+00:41:38.640 --> 00:41:39.720
+our performance issues have been.
+
+00:41:39.800 --> 00:41:41.920
+It's either a badly written query, or we're returning
+
+00:41:42.180 --> 00:41:43.920
+most of the database when the user just wants to know
+
+00:41:44.020 --> 00:41:45.800
+about one thing, or something silly like that.
+
+00:41:46.240 --> 00:41:47.520
+I was thinking about low-hanging ones,
+
+00:41:47.620 --> 00:41:48.440
+which I think you asked about.
+
+00:41:48.740 --> 00:41:52.560
+So I'd say uvloop, which is still a noticeable improvement.
+
+00:41:53.740 --> 00:41:58.720
+And also, because I think it's likely a lot of us are returning JSON, often changing the
+
+00:41:59.100 --> 00:42:04.080
+JSON serializer to one of the faster ones can be noticeable as well, and obviously quite easy to do.
+
+00:42:04.330 --> 00:42:05.640
+So yeah, that's my key.
+
+00:42:05.760 --> 00:42:06.460
+That's really good advice.
+
+00:42:06.570 --> 00:42:08.360
+I didn't think about the JSON serializer.
+
+00:42:08.820 --> 00:42:09.560
+Which one do you recommend?
+
+00:42:09.900 --> 00:42:11.040
+I think, is it ujson?
+
+00:42:11.420 --> 00:42:12.280
+Or is it orjson?
+
+00:42:12.860 --> 00:42:14.200
+I can't remember which one was deprecated.
+
+00:42:15.880 --> 00:42:21.180
+But yeah, if you look at the TechEmpower benchmarks, everyone's changing the JSON serializer
+
+00:42:21.310 --> 00:42:22.540
+to get that bit of extra speed.
+
+00:42:22.860 --> 00:42:27.440
+But yeah, you're like, our framework looks bad because our JSON serializer is like a third
+
+00:42:27.470 --> 00:42:28.020
+of the performance.
+
+00:42:28.270 --> 00:42:31.680
+We changed, well, David added a JSON provider to Flask.
+
+00:42:31.880 --> 00:42:35.120
+And yeah, you could see it make a difference in the TechEmpower benchmarks. 
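Swapping the serializer usually comes down to a small indirection layer like this sketch. The function name here is made up, and it falls back to stdlib json when orjson isn't installed; a framework's pluggable JSON provider does essentially the same routing:

```python
import json

try:
    import orjson  # fast third-party encoder, used when available
except ImportError:
    orjson = None

def dumps(obj) -> str:
    """Serialize with orjson if installed, stdlib json otherwise."""
    if orjson is not None:
        return orjson.dumps(obj).decode()       # orjson returns bytes
    return json.dumps(obj, separators=(",", ":"))  # compact stdlib output

payload = {"ok": True, "items": [1, 2, 3]}
assert json.loads(dumps(payload)) == payload
```

Because callers only ever touch `dumps`, the faster library can be dropped in later without changing any view code.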
+
+00:42:35.400 --> 00:42:35.960
+So that was really good.
+
+00:42:36.060 --> 00:42:36.360
+Yeah, cool.
+
+00:42:36.540 --> 00:42:37.600
+Yeah, it's pluggable now.
+
+00:42:37.860 --> 00:42:42.880
+But if you're now installing Flask, orjson, I mean, I don't know what other JSON library
+
+00:42:43.000 --> 00:42:44.780
+you'd be using at this point, unless you're already using one.
+
+00:42:45.140 --> 00:42:47.240
+But orjson is very, very fast.
+
+00:42:47.520 --> 00:42:49.200
+Okay, this is something I'm going to be looking at later.
+
+00:42:49.660 --> 00:42:57.440
+So over to Django, Jeff, David talked about running stuff in the background and was it Django 5 or Django 6 that got the background task thing?
+
+00:42:57.640 --> 00:42:59.640
+Yeah, Django 6 just came out a couple of weeks ago.
+
+00:43:00.200 --> 00:43:07.220
+And I'll hand that off to Carlton in a second because I think Carlton's had more to do with the actual plumbing, being on the steering council.
+
+00:43:07.620 --> 00:43:11.860
+My advice to people is the best way to scale something is just to not do it, avoid the process completely.
+
+00:43:12.040 --> 00:43:16.000
+So like I mentioned with the CDN earlier, for content-heavy sites, cache the crap out of stuff.
+
+00:43:16.500 --> 00:43:17.540
+It doesn't even have to hit your servers.
+
+00:43:17.960 --> 00:43:22.020
+You can get a lot, as we mentioned earlier, too, just by doubling the amount of resources a project has.
+
+00:43:22.400 --> 00:43:24.840
+Django is pretty efficient these days, especially with async views.
+
+00:43:25.290 --> 00:43:31.500
+Like everybody else has said, too, any blocking code, move off to threads, move off to a background queue.
+
+00:43:31.820 --> 00:43:35.140
+Django Q2 is my favorite one to use because you can use a database.
+
+00:43:35.780 --> 00:43:39.580
+So for those little side projects where you just want to run one or two processes, you can use it.
+
+00:43:39.580 --> 00:43:40.040
+It works great.
+ +00:43:40.580 --> 00:43:43.160 +And Carlton, if you want to talk about Django internals. + +00:43:43.260 --> 00:43:43.740 +Yeah, OK. + +00:43:43.900 --> 00:43:51.120 +So the new task framework I just mentioned, the main thing, the main sort of bit about it is that it's, again, this pluggable Django API. + +00:43:51.360 --> 00:43:53.080 +So it gives a standard task API. + +00:43:53.220 --> 00:43:57.120 +So if you're writing a third-party library and you, I know, you need to send an email. + +00:43:57.540 --> 00:43:58.820 +It's the canonical example, right? + +00:43:58.820 --> 00:44:00.880 +You need to send an email in your third-party library. + +00:44:01.320 --> 00:44:07.780 +Before, you'd have had to tie yourself to a specific queue implementation, whereas now Django is providing a kind of like an ORM of tasks. + +00:44:07.820 --> 00:44:08.180 +Right, right. + +00:44:08.240 --> 00:44:11.780 +You got to do Redis, you got to do Celery, and you got to manage things and all that. + +00:44:11.840 --> 00:44:14.820 +You don't have to pick that now as the third-party package author. + +00:44:14.900 --> 00:44:16.420 +You can just say, right, just use Django, + +00:44:17.080 --> 00:44:18.640 +wrap this as a Django task and queue it. + +00:44:18.940 --> 00:44:22.780 +And then the developer, when they come to choose their backend, + +00:44:22.880 --> 00:44:24.940 +if they want to use Celery or they want to use Django Q2 + +00:44:25.060 --> 00:44:26.720 +or they want to use the Django task backend, + +00:44:26.900 --> 00:44:30.440 +which Jake Howard, who wrote this for Django provided as well, + +00:44:30.800 --> 00:44:32.040 +you can just plug that in. + +00:44:32.300 --> 00:44:34.700 +So it's a pluggable interface for tasks, + +00:44:34.980 --> 00:44:37.240 +which is, I think, the really nice thing about it. + +00:44:37.600 --> 00:44:40.700 +In terms of quick wins, everybody's mentioned almost all of mine. 
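Carlton's "ORM of tasks" point can be sketched concretely. This is an illustrative, not-runnable-standalone sketch of the shape of the Django 6.0 task API: the module path, decorator, and backend string follow the DEP 14 design, but treat the exact names as assumptions to check against the Django 6.0 release notes. The settings line is the part a developer would swap for a Celery, Django Q2, or database-backed backend:

```python
# settings.py -- pick a backend; third-party packages can provide their own
TASKS = {
    "default": {
        "BACKEND": "django.tasks.backends.immediate.ImmediateBackend",
    }
}

# tasks.py -- a library author depends only on the generic task API
from django.tasks import task

@task()
def send_welcome_email(user_id: int) -> None:
    ...  # look up the user and send the email

# call site -- enqueue instead of calling directly;
# whichever backend is configured picks the work up
result = send_welcome_email.enqueue(user_id=42)
```

The win is exactly what Carlton describes: the third-party package never has to pick Redis vs. Celery vs. a database queue; that choice moves to the project's settings.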
+
+00:44:40.840 --> 00:44:43.420
+I'm going to, Cody and Phil, they mentioned the database.
+
+00:44:43.920 --> 00:44:44.600
+That's the big one.
+
+00:44:44.800 --> 00:44:48.280
+Django, the ORM, because it does lazy related lookups,
+
+00:44:48.420 --> 00:44:51.180
+it's very easy to trigger an N+1 where, you know,
+
+00:44:51.860 --> 00:44:55.000
+the book has multiple authors and suddenly you're iterating through the books
+
+00:44:55.160 --> 00:44:57.120
+and you're iterating through the authors and it's a lookup.
+
+00:44:57.640 --> 00:45:00.360
+So you need to do things like prefetch_related, select_related.
+
+00:45:00.470 --> 00:45:02.060
+You need to just check that you've got those.
+
+00:45:02.240 --> 00:45:04.880
+Django Debug Toolbar is a great thing to run in development
+
+00:45:05.040 --> 00:45:07.700
+where you can see the queries and it'll tell you where you've got the duplicates.
+
+00:45:08.280 --> 00:45:11.100
+And then the slightly bigger one is to just check your indexes.
+
+00:45:11.350 --> 00:45:14.300
+The ORM will create the right indexes
+
+00:45:14.570 --> 00:45:16.520
+if you're going through primary keys or unique fields.
+
+00:45:16.590 --> 00:45:19.220
+But sometimes you're doing a filter on some field,
+
+00:45:19.800 --> 00:45:21.460
+and then there's not the right index there,
+
+00:45:21.460 --> 00:45:22.540
+and that can really slow you down.
+
+00:45:22.590 --> 00:45:26.460
+So again, you can do the SQL explain on that and find that.
+
+00:45:26.860 --> 00:45:29.820
+And then the thing I was going to say originally was caching,
+
+00:45:30.160 --> 00:45:34.480
+is get a Redis instance, stick it next to your Django app,
+
+00:45:34.530 --> 00:45:36.620
+and as Jeff said, don't do the work.
+
+00:45:36.740 --> 00:45:40.000
+If you're continually rendering the same page and it never changes,
+
+00:45:40.620 --> 00:45:42.980
+cache it and pull it from the cache rather than rendering.
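To make the N+1 shape concrete: for a hypothetical Book model with a many-to-many authors field and a publisher foreign key (all names invented for illustration; this sketch assumes a configured Django project and isn't runnable on its own):

```python
# N+1: one query for the books, then one more query per book for its authors
for book in Book.objects.all():
    names = [a.name for a in book.authors.all()]   # extra query each iteration

# Fix: fetch all the related authors up front in a single extra query
for book in Book.objects.prefetch_related("authors"):
    names = [a.name for a in book.authors.all()]   # no additional queries

# For foreign keys / one-to-one relations, select_related() does a SQL
# join instead of a second query:
books = Book.objects.select_related("publisher")
```

Django Debug Toolbar makes the difference visible immediately: the first loop shows dozens of duplicate queries, the second shows two.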
+
+00:45:43.120 --> 00:45:45.460
+Because DB queries are one of your biggest things.
+
+00:45:45.560 --> 00:45:47.500
+The second one's always going to be serialization.
+
+00:45:47.620 --> 00:45:49.300
+It's either serialization or template rendering.
+
+00:45:49.620 --> 00:45:53.760
+So if you can avoid that by caching, you can save an awful lot of time on your account.
+
+00:45:53.780 --> 00:45:53.900
+Yeah.
+
+00:45:54.360 --> 00:45:56.740
+I was wondering if somebody would come back with database indexes,
+
+00:45:56.900 --> 00:46:00.900
+because that's like a 100x multiplier for free almost.
+
+00:46:01.200 --> 00:46:03.040
+It's such a big deal.
+
+00:46:03.160 --> 00:46:03.860
+It really can be.
+
+00:46:03.900 --> 00:46:06.700
+If you're making a particular query and it's doing a full table scan,
+
+00:46:06.720 --> 00:46:11.400
+all of a sudden you put the index in, it's instant. It's like, oh, wow. You don't have to be a DBA or
+
+00:46:12.619 --> 00:46:16.400
+master information architect sort of thing. I don't know about Postgres. I'm sure it has it.
+
+00:46:16.540 --> 00:46:21.720
+Somebody can tell me. But with Mongo, you can turn on in the database: I want you to log all
+
+00:46:21.940 --> 00:46:27.060
+slow queries, and slow for me means 20 milliseconds or whatever. Like you put a number in and then
+
+00:46:27.660 --> 00:46:31.620
+you run your app for a while and you go look at what's slow, sorted by slowest. And then you can see,
+
+00:46:31.640 --> 00:46:35.220
+well, maybe that needs an index, right? Like just let your app tell you what you got to do.
+
+00:46:35.500 --> 00:46:36.840
+Yeah, there is a post.
+
+00:46:37.040 --> 00:46:38.740
+I'm just trying to see if I can quickly look it up now.
+
+00:46:38.860 --> 00:46:40.060
+There's a Postgres extension,
+
+00:46:40.390 --> 00:46:42.000
+which will automatically run explain
+
+00:46:42.680 --> 00:46:44.540
+on the slow queries and log them for you.
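The Mongo side of this can be sketched with pymongo and the server's profiler command (a sketch, not runnable standalone: it assumes a running MongoDB, and the database name is made up; profiling level 1 with `slowms` is the server's "log only slow operations" setting):

```python
from pymongo import MongoClient

db = MongoClient()["myapp"]  # hypothetical database name

# Level 1 = record only operations slower than slowms (here, 20 ms)
db.command("profile", 1, slowms=20)

# ...run the app for a while, then review the slowest operations.
# A planSummary of "COLLSCAN" is the classic "you're missing an index" flag.
for op in db["system.profile"].find().sort("millis", -1).limit(10):
    print(op["millis"], "ms", op.get("ns"), op.get("planSummary"))
```

This is exactly the "let your app tell you what to index" workflow: the profiler collection accumulates the evidence while real traffic runs.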
+
+00:46:44.680 --> 00:46:45.360
+So it'll...
+
+00:46:45.400 --> 00:46:45.660
+There you go.
+
+00:46:46.400 --> 00:46:47.080
+See if I can find...
+
+00:46:47.080 --> 00:46:48.240
+It's pg_stat_statements, I think,
+
+00:46:48.240 --> 00:46:49.020
+is what you're thinking about.
+
+00:46:49.160 --> 00:46:49.500
+Right, okay.
+
+00:46:49.700 --> 00:46:52.280
+If you're unsure about your database indexes,
+
+00:46:52.620 --> 00:46:55.340
+do this, or at least go back and review your queries.
+
+00:46:55.640 --> 00:46:56.240
+Yeah, I agree.
+
+00:46:56.590 --> 00:46:56.780
+Very good.
+
+00:46:57.240 --> 00:46:59.220
+All right, I can see we're blazing through these questions.
+
+00:47:00.060 --> 00:47:00.820
+I had one more.
+
+00:47:01.120 --> 00:47:01.960
+If I can mention one.
+
+00:47:01.960 --> 00:47:02.540
+No, please go ahead.
+
+00:47:02.730 --> 00:47:03.260
+Yeah, go ahead, David.
+
+00:47:03.420 --> 00:47:27.740
+If you want to like get some more responsive parts of your website, like make your website a little more responsive or interactive with the user, HTMX or Datastar, especially like if you're using Quart or another ASGI framework where you can do SSE, server-sent events, or WebSockets, streaming little bits of changes to the web front end and then rendering them with the same HTML you're already writing can make things a lot more responsive.
+
+00:47:28.240 --> 00:47:33.240
+We had a talk about that from Chris May at FlaskCon last year, which you can find on YouTube.
+
+00:47:33.540 --> 00:47:38.280
+This is not one of the questions, but let me just stop for a quick riff on this, folks.
+
+00:47:38.980 --> 00:47:41.380
+Out in the audience, someone was asking, what about HTMX?
+
+00:47:41.800 --> 00:47:47.860
+And I think more broadly, I am actually a huge fan of server-side, template-based apps.
+
+00:47:48.060 --> 00:47:51.380
+I think it just keeps things simpler in a lot of ways, unless you need a lot of interactivity.
+
+00:47:51.740 --> 00:47:57.520
+But things like HTMX or a little bit of JavaScript can reduce a lot of the traffic and stuff.
+
+00:47:57.580 --> 00:47:59.160
+Where do people land on those kinds of things?
+
+00:47:59.360 --> 00:48:05.760
+I absolutely love HTMX, not just because you don't have to write a lot of JavaScript or whatever,
+
+00:48:06.300 --> 00:48:14.040
+but mostly because if I'm just building a simple app that needs a bit more than just being a static HTML page,
+
+00:48:14.040 --> 00:48:17.840
+it needs some interactivity, a little bit of reactivity.
+
+00:48:18.380 --> 00:48:34.160
+I feel like having the whole overhead of building an SPA or whatever tools you need for the whole JavaScript, TypeScript, whatever stack, it's just so much work to get a little bit to make a simple thing a little bit nicer, a little bit more reactive.
+
+00:48:34.740 --> 00:48:37.540
+And I feel like HTMX just fits right in there.
+
+00:48:37.700 --> 00:48:39.420
+It's super great.
+
+00:48:39.520 --> 00:48:41.340
+I've built a couple of things with it now,
+
+00:48:41.960 --> 00:48:44.760
+a few of my own projects, a few things at work.
+
+00:48:45.280 --> 00:48:47.460
+And it makes things so much easier
+
+00:48:47.720 --> 00:48:50.480
+where the work probably wouldn't have been done
+
+00:48:50.570 --> 00:48:52.320
+otherwise, just because it's too much.
+
+00:48:52.420 --> 00:48:55.000
+If you're doing a whole front end thing
+
+00:48:55.140 --> 00:48:57.120
+that you have then to deploy and build and whatever,
+
+00:48:57.520 --> 00:48:59.180
+or it would have been less nice.
+
+00:48:59.640 --> 00:49:02.420
+So it's an amazing, really amazing thing.
+
+00:49:02.520 --> 00:49:10.460
+As the maintainer and author, though, one of the things that is not frustrating, but it's understandable, is that HTMX is not for everybody, right?
+
+00:49:10.660 --> 00:49:15.020
+It's just that you can't use HTMX on all occasions, or Datastar, right?
+
+00:49:15.090 --> 00:49:20.180
+And so there are people that are always going to want to use React and there's going to be people that want to use all these other frameworks.
+
+00:49:20.290 --> 00:49:24.120
+And so having some cohesive way to make them all talk together, I think, is important.
+
+00:49:24.520 --> 00:49:29.800
+I don't have that answer yet, but I just know that like I can't always say HTMX is it, right?
+
+00:49:29.940 --> 00:49:33.840
+And then you'll have a great time because I'll inevitably meet somebody that says, I need to do this.
+
+00:49:33.910 --> 00:49:37.900
+And they're right, and a single-page application or something is more appropriate for that.
+
+00:49:37.950 --> 00:49:41.020
+And so it's obviously the right tool for the right job when you need it.
+
+00:49:41.100 --> 00:49:45.560
+But, you know, I want to make something that is cohesive depending on whatever library you want to use.
+
+00:49:45.730 --> 00:49:46.960
+I would throw one thing in there, though.
+
+00:49:47.080 --> 00:49:51.160
+I would rather somebody start with HTMX than I would start with React if you don't need it.
+
+00:49:51.320 --> 00:49:54.140
+Because React can be total overkill. It can be great for some applications.
+
+00:49:54.720 --> 00:49:59.040
+But oftentimes, as consultants, we see people like having an about page and they throw React at it.
+
+00:49:59.160 --> 00:49:59.940
+Like, why do you need that?
+
+00:50:00.340 --> 00:50:02.040
+Like, especially for small things with partials.
+
+00:50:02.240 --> 00:50:03.500
+Do you mean you don't want to start with Angular?
+
+00:50:03.780 --> 00:50:05.800
+You know, it's fine if you need it,
+
+00:50:06.080 --> 00:50:07.740
+but I don't think you really need it.
+
+00:50:07.920 --> 00:50:09.680
+Like, introduce tools as you need them.
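The pattern being praised here — the server renders a small HTML fragment and htmx swaps it into the page — can be sketched without any framework at all. The `hx-*` attributes below are real htmx attributes; the endpoint path and markup are invented for illustration:

```python
# The page ships with htmx attributes instead of hand-written JavaScript:
PAGE = """
<button hx-get="/fragments/price" hx-target="#price" hx-swap="innerHTML">
  Refresh price
</button>
<span id="price">loading...</span>
"""

def price_fragment(price: float) -> str:
    """What the /fragments/price endpoint would return: just the snippet.

    htmx issues the GET when the button is clicked and swaps the response
    into #price -- no full page reload, no client-side templating.
    """
    return f"${price:,.2f}"

print(price_fragment(1234.5))  # → $1,234.50
```

The server-side view stays a plain "render some HTML" function, which is why this slots so naturally into Flask, Quart, or Django without an SPA build pipeline.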
+
+00:50:10.000 --> 00:50:11.840
+Django 6.0 just added template partials,
+
+00:50:12.010 --> 00:50:13.940
+and I guess my job here is to hand off to Carlton
+
+00:50:14.080 --> 00:50:15.160
+because this is his feature.
+
+00:50:15.320 --> 00:50:16.900
+Yeah, I was happy to see that come in there, Carlton.
+
+00:50:17.020 --> 00:50:17.540
+Nice job.
+
+00:50:17.860 --> 00:50:18.620
+No, it's okay.
+
+00:50:19.230 --> 00:50:19.920
+Plug the new feature.
+
+00:50:20.100 --> 00:50:23.180
+So, I mean, I stepped down as a fellow in 2023
+
+00:50:24.550 --> 00:50:25.160
+into a new business,
+
+00:50:25.560 --> 00:50:28.299
+and I read the essay about template fragments
+
+00:50:28.320 --> 00:50:34.180
+on the htmx website, where it's these, like, named reusable bits in the templates, and I was
+
+00:50:34.200 --> 00:50:38.600
+like, I need that. So I built django-template-partials, released as a third-party package, and it's now just
+
+00:50:38.630 --> 00:50:44.460
+been merged into core for Django 6.0. And I have to say about HTMX, it's really changed the way I write
+
+00:50:44.580 --> 00:50:49.020
+websites. Before I was the fellow, I used to write mobile applications and do the front
+
+00:50:49.120 --> 00:50:52.600
+end of the mobile application, then the back end in Django using Django REST Framework, and I was,
+
+00:50:52.940 --> 00:50:58.280
+that's how I got into, you know, open source, was by Django REST Framework. And since starting the business,
+
+00:50:58.300 --> 00:51:02.780
+we're three years in, we've hardly got a JSON endpoint in sight. It's like two, three, four of
+
+00:51:02.780 --> 00:51:08.000
+them in the whole application, and it's, oh, it's just a delight again. You know, you asked me at the
+
+00:51:08.140 --> 00:51:12.440
+beginning, Michael, am I having fun? Yeah, I really am having fun, and HTMX is the reason. I do grant
+
+00:51:12.590 --> 00:51:18.400
+there are, you know, these use cases. Awesome. All right, let's talk about our last topic, and we have
+
+00:51:18.740 --> 00:51:24.300
+five minutes-ish to do that, so we gotta, we gotta stay on target quick. But let's just go around,
+
+00:51:24.540 --> 00:51:29.780
+run real quick here. We talked about upgrading the Python version, getting better performance out of
+
+00:51:29.900 --> 00:51:36.660
+it. I mentioned the lower memory side, but I think one of the underappreciated aspects of this, you
+
+00:51:36.660 --> 00:51:43.360
+know, the Instagram team did a huge talk, you know, on it a while ago, is the memory that you run into
+
+00:51:43.560 --> 00:51:47.380
+when you start to scale out your stuff on the server. Because you're like, oh, I want to have four
+
+00:51:47.580 --> 00:51:51.520
+workers so I can have more concurrency because of the GIL. So now you've got four copies of
+
+00:51:51.540 --> 00:51:56.500
+everything that you cache in memory, and just like the runtime, and now you need eight gigs instead of
+
+00:51:56.500 --> 00:52:02.380
+what would have been one, or who knows, right? But with free-threaded Python coming on, which I've
+
+00:52:02.380 --> 00:52:07.920
+seen a couple of comments in the chat, like, hey, tell us about this, we could have true
+
+00:52:08.380 --> 00:52:13.420
+concurrency and we wouldn't need to scale as much on the process side. I think giving us both better
+
+00:52:13.600 --> 00:52:17.480
+performance and the ability to say, well, you actually have four times less memory, so you could
+
+00:52:17.440 --> 00:52:22.660
+run smaller servers or whatever. What's the free-threaded story for all the frameworks? Carlton,
+
+00:52:22.780 --> 00:52:26.860
+let's go back to you, or do it in reverse. I'm really excited about it. I don't know how it's
+
+00:52:26.860 --> 00:52:30.940
+going to play out, but I'm really excited about it. All it can do is help Django. The async story in
+
+00:52:31.040 --> 00:52:36.460
+Django is nice and mature now, but still, most of it's sync. Like, you know, you're still going to
+
+00:52:36.560 --> 00:52:39.960
+default to sync. You're still going to write your sync views. You still got template rendering, you
+
+00:52:39.960 --> 00:52:43.880
+know, Django's a template-based kind of framework, really. You're still going to want to
+
+00:52:43.880 --> 00:52:50.180
+run things synchronously, concurrently, and proper threads are going to be, yeah, they can't but help.
+
+00:52:50.270 --> 00:52:53.920
+I don't know how it's going to roll out. I'll let someone else go because I'm getting locked up.
+
+00:52:53.920 --> 00:52:59.520
+Yeah, let me just elaborate on that for people out there before we move on: you could set up your
+
+00:52:59.720 --> 00:53:05.100
+worker process to say, I want you to actually run eight threads in this one worker process.
+
+00:53:05.760 --> 00:53:10.600
+And when multiple requests come in, they could both be sent off to the same worker to be processed.
+
+00:53:10.740 --> 00:53:15.240
+And that allows that worker to do more, unless the GIL comes along and says, stop, you only
+
+00:53:15.380 --> 00:53:17.060
+get to do one thing at a time in threads in Python.
+
+00:53:17.640 --> 00:53:18.960
+And all of a sudden, a lot of that falls down.
+
+00:53:19.100 --> 00:53:22.080
+This basically uncorks that and makes that easy all of a sudden.
+
+00:53:22.210 --> 00:53:26.120
+Even if you yourself are not writing async, your server can be more async.
+
+00:53:26.220 --> 00:53:26.420
+Yeah.
+
+00:53:26.450 --> 00:53:31.220
+And this is the thing that we found with ASGI is that you dispatch to a, you know, using
+
+00:53:31.420 --> 00:53:36.220
+sync_to_async, or you dispatch it to a thread, a thread pool executor, but Python doesn't run
+
+00:53:36.380 --> 00:53:37.180
+that concurrently.
+
+00:53:37.320 --> 00:53:38.880
+And so it's like, or in parallel.
+
+00:53:39.100 --> 00:53:42.340
+So it's like, ah, it doesn't actually go as fast as you want it to.
+
+00:53:42.580 --> 00:53:44.880
+And so you end up wanting multiple processes still.
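A tiny stdlib-only probe makes the worker-versus-thread tradeoff observable. `sysconfig`'s `Py_GIL_DISABLED` config var marks a free-threaded build of CPython 3.13+, and on those versions `sys._is_gil_enabled()` reports the live state (the GIL can be turned back on at runtime). The CPU-bound thread pool below is exactly the kind of workload the GIL serializes:

```python
import sys
import sysconfig
import time
from concurrent.futures import ThreadPoolExecutor

# Py_GIL_DISABLED is set on free-threaded ("t") builds of Python 3.13+.
free_threaded_build = bool(sysconfig.get_config_var("Py_GIL_DISABLED"))

# On 3.13+ the GIL can still be re-enabled at runtime, so check the live state.
gil_probe = getattr(sys, "_is_gil_enabled", None)
gil_active = gil_probe() if gil_probe is not None else True

def busy(n: int) -> int:
    # Pure-Python CPU work: exactly what the GIL serializes across threads.
    return sum(i * i for i in range(n))

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(busy, [200_000] * 4))
elapsed = time.perf_counter() - start

print(f"free-threaded build: {free_threaded_build}, GIL active: {gil_active}")
print(f"4 CPU-bound tasks in threads took {elapsed:.3f}s")
# With the GIL active, the four tasks run one at a time; on a free-threaded
# build they can genuinely run in parallel -- in one process, sharing one
# copy of everything in memory, which is the point Michael is making.
```

Running this under a regular build versus a `python3.13t` build shows the wall-clock difference directly, with no framework in the loop.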
+ +00:53:45.080 --> 00:53:45.800 +All right, let's keep it with Django. + +00:53:46.040 --> 00:53:46.680 +Jeff, what do you think? + +00:53:47.160 --> 00:53:48.520 +I'm going to defer to the others on this. + +00:53:48.580 --> 00:53:49.480 +I have the least thoughts. + +00:53:49.760 --> 00:53:51.120 +All right, write down the stack, Sebastian. + +00:53:51.620 --> 00:53:54.540 +Write down the video, not website, web framework. + +00:53:55.760 --> 00:53:56.700 +I think it's going to be awesome. + +00:53:56.880 --> 00:53:58.940 +This is going to help so much, so many things. + +00:53:59.560 --> 00:54:04.780 +The challenge is going to be third-party libraries used by each individual application + +00:54:05.140 --> 00:54:06.740 +and if they are compatible or not. + +00:54:07.020 --> 00:54:08.440 +That's where the challenge is going to be. + +00:54:08.580 --> 00:54:12.760 +But other than that, it's just going to be free extra performance for everyone. + +00:54:13.160 --> 00:54:15.540 +Just, you know, like just upgrading the version of Python. + +00:54:15.720 --> 00:54:16.420 +So that's going to be us. + +00:54:16.480 --> 00:54:16.620 +Cody. + +00:54:16.860 --> 00:54:18.400 +Yeah, I'm going to echo what Sebastian just said. + +00:54:18.460 --> 00:54:21.520 +The third party libraries, I think, are going to be the big kind of sticky point here. + +00:54:21.720 --> 00:54:23.120 +I'm looking forward to seeing what we can do. + +00:54:23.280 --> 00:54:24.880 +I'm going to kind of hold my thoughts on that. + +00:54:24.960 --> 00:54:28.660 +Yannick kind of speak a little bit on it because I know that he's looked at msgspec specifically + +00:54:28.920 --> 00:54:31.720 +and some of the other things that might, you know, give some better context here. + +00:54:31.880 --> 00:54:35.880 +But yes, the third party libraries are going to be the kind of the sticky issue. + +00:54:36.160 --> 00:54:38.720 +but I'm looking forward to seeing what we can make happen. 
+
+00:54:38.880 --> 00:54:43.200
+I'm super excited, actually, specifically about async stuff,
+
+00:54:43.440 --> 00:54:45.580
+because for most of the time, it's like,
+
+00:54:46.600 --> 00:54:49.100
+if you can already saturate your CPU,
+
+00:54:50.260 --> 00:54:51.240
+async doesn't help you much.
+
+00:54:51.760 --> 00:54:53.500
+Well, now, if you have proper threads,
+
+00:54:53.560 --> 00:54:56.180
+you can actually do that in async as well.
+
+00:54:56.840 --> 00:54:59.880
+And I think it's going to speed up a lot of applications
+
+00:55:00.260 --> 00:55:01.140
+just by default,
+
+00:55:01.520 --> 00:55:06.120
+because almost all async applications out there
+
+00:55:06.260 --> 00:55:11.500
+use threads in some capacity because, well, most things aren't async by nature.
+
+00:55:12.160 --> 00:55:16.300
+So they will use a thread pool and it will run more concurrently.
+
+00:55:16.300 --> 00:55:18.360
+And so that's going to be better.
+
+00:55:18.840 --> 00:55:25.820
+But I'm also a bit scared about a few things, mainly, as a few others have said now,
+
+00:55:26.500 --> 00:55:32.400
+third-party library extensions, specifically those that are Python C extensions.
+
+00:55:33.320 --> 00:55:35.640
+Just recently, I think like three weeks ago,
+
+00:55:36.140 --> 00:55:40.200
+we got msgspec released for Python 3.14
+
+00:55:40.410 --> 00:55:41.960
+and proper free-threading support.
+
+00:55:42.620 --> 00:55:44.000
+And that took a lot of work.
+
+00:55:44.540 --> 00:55:47.760
+Fortunately, a few of the Python core devs
+
+00:55:47.800 --> 00:55:49.840
+chimed in and contributed PRs
+
+00:55:49.920 --> 00:55:51.580
+and helped out with that.
+
+00:55:52.300 --> 00:55:54.380
+And all around the ecosystem,
+
+00:55:54.490 --> 00:55:55.240
+the last few years,
+
+00:55:55.340 --> 00:55:56.560
+there's been a lot of work going on.
+
+00:55:57.020 --> 00:56:00.720
+But especially for more niche libraries
+
+00:56:01.000 --> 00:56:02.660
+that are still here and there,
+
+00:56:02.880 --> 00:56:09.720
+I think there's still a lot to do and possibly also quite a few bugs lurking here and there
+
+00:56:09.860 --> 00:56:13.020
+that haven't been found or are really hard to track down.
+
+00:56:13.560 --> 00:56:18.380
+I'm curious and maybe a bit scared, that's too hard of a word, but I'm cautious.
+
+00:56:18.820 --> 00:56:21.880
+It's going to be a little bit of a bumpy ride as people turn that on
+
+00:56:22.080 --> 00:56:23.840
+and then see the reality of what's happening.
+
+00:56:24.420 --> 00:56:27.920
+However, I want to take Cody's warning and turn it on its head
+
+00:56:28.300 --> 00:56:29.420
+about these third-party libraries,
+
+00:56:29.760 --> 00:56:38.820
+because I think it's also an opportunity for regular Python developers who are not async fanatics to actually capture some of that capability.
+
+00:56:39.400 --> 00:56:47.880
+Say some library says, hey, I realize that if we actually implement this lower-level thing, whose implementation you don't actually see, in true threading.
+
+00:56:48.220 --> 00:56:50.160
+And then you use it, but you don't actually do threading.
+
+00:56:50.160 --> 00:56:52.400
+You just call even a blocking function.
+
+00:56:52.880 --> 00:56:57.320
+You might get a huge performance boost, a little bit like David was talking about with MarkupSafe.
+
+00:56:57.840 --> 00:57:00.040
+And all of a sudden, with you doing nothing
+
+00:57:00.600 --> 00:57:03.980
+with your code, it goes five times faster on an eight-core
+
+00:57:04.200 --> 00:57:06.680
+machine or something in little places where it used to matter.
+
+00:57:06.840 --> 00:57:08.960
+I'm super excited for--
+
+00:57:09.410 --> 00:57:12.720
+we're currently focused on the things that are out there
+
+00:57:12.900 --> 00:57:15.180
+right now and that might need to be updated.
+
+00:57:15.270 --> 00:57:19.260
+But I'm super excited for what else might come of this,
+
+00:57:19.620 --> 00:57:23.520
+new things that will be developed or stuff
+
+00:57:23.530 --> 00:57:25.940
+that we are currently not thinking about
+
+00:57:25.960 --> 00:57:29.620
+or that hadn't been considered for the past 30 years or so,
+
+00:57:29.720 --> 00:57:33.240
+because it just wasn't feasible or wasn't possible or didn't make sense at all.
+
+00:57:33.720 --> 00:57:35.720
+I think it would pay off definitely.
+
+00:57:36.100 --> 00:57:36.940
+All right. Team Flask.
+
+00:57:37.420 --> 00:57:39.180
+You guys got the final word.
+
+00:57:39.440 --> 00:57:42.320
+I think it would probably be more advantageous to WSGI apps
+
+00:57:42.520 --> 00:57:43.900
+than it will for ASGI apps.
+
+00:57:44.140 --> 00:57:47.180
+And when I've been playing with it, it's mostly on the WSGI Flask side
+
+00:57:47.280 --> 00:57:48.680
+where I'm quite excited about it.
+
+00:57:48.920 --> 00:57:52.220
+At the same time, like the others, I'm a bit worried because it's not clear to me,
+
+00:57:52.380 --> 00:57:54.980
+for example, that green threading is going to work that well
+
+00:57:55.000 --> 00:57:56.040
+with free threading.
+
+00:57:56.400 --> 00:57:57.340
+And that may have been fixed,
+
+00:57:57.380 --> 00:57:58.520
+but I don't think it has yet.
+
+00:57:58.740 --> 00:58:01.480
+And that might then break a lot of WSGI apps.
+
+00:58:02.080 --> 00:58:03.240
+So next, I think.
+
+00:58:03.300 --> 00:58:05.760
+But yeah, very excited for Flask in particular.
+
+00:58:06.000 --> 00:58:07.260
+Thanks for bringing up green threading.
+
+00:58:07.260 --> 00:58:08.460
+I added that to my notes
+
+00:58:09.600 --> 00:58:11.020
+to mention right now.
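For context on the "green threading" worry: many WSGI deployments get gevent-style concurrency by monkey patching the standard library at startup. A minimal sketch of that setup (gevent is third-party, so the import is guarded here to keep the example runnable either way; in a real app the patch must run as the very first thing in the entry point):

```python
# Must happen before anything else imports socket/ssl/time, so in a real
# WSGI app this goes at the very top of the entry-point module.
try:
    from gevent import monkey
    monkey.patch_all()
    green_threads_available = True
except ImportError:  # gevent not installed; nothing is patched
    green_threads_available = False

# Once patched, blocking calls (socket reads, sleeps) cooperatively yield
# to other greenlets, so one worker process can juggle many connections --
# the same concurrency win free threading aims to give real OS threads.
print(f"gevent monkey patching active: {green_threads_available}")
```

The open question raised here is whether that greenlet machinery and the free-threaded interpreter compose cleanly, which is worth testing before combining them in production.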
+
+00:58:11.300 --> 00:58:14.360
+So Flask already has emphasized
+
+00:58:14.800 --> 00:58:15.980
+for years and years and years:
+
+00:58:16.340 --> 00:58:17.840
+don't store stuff globally,
+
+00:58:18.260 --> 00:58:19.020
+don't have global state,
+
+00:58:19.360 --> 00:58:21.220
+bind stuff to the request/response cycle
+
+00:58:21.420 --> 00:58:22.880
+if you need to store stuff,
+
+00:58:23.100 --> 00:58:24.540
+look stuff up from a cache otherwise.
+
+00:58:24.600 --> 00:58:27.320
+And my impression is that that emphasis is pretty successful.
+
+00:58:27.330 --> 00:58:31.560
+I don't think there's any well-known extensions using, like, global state or anything like that.
+
+00:58:31.960 --> 00:58:35.580
+It's helped that the dev server that we have is threaded by default.
+
+00:58:36.160 --> 00:58:39.340
+Like it's not going for performance, obviously, it's just running on your local machine, but
+
+00:58:39.580 --> 00:58:42.680
+it's already like running in a threaded environment, running your application in a
+
+00:58:42.800 --> 00:58:45.160
+threaded environment, not a process-based one by default.
+
+00:58:45.740 --> 00:58:49.360
+I don't know if anybody even knows that you can run the dev server as process-based.
+
+00:58:49.720 --> 00:58:53.460
+And we also already had, for a decade or more than a decade,
+
+00:58:53.960 --> 00:58:59.380
+Gevent to enable the exact same thing that free threading is enabling for Flask,
+
+00:58:59.520 --> 00:59:02.700
+which is concurrent work and connections.
+
+00:59:03.740 --> 00:59:06.600
+And so plenty of applications are already deployed that way,
+
+00:59:06.880 --> 00:59:09.800
+using Gevent to do what ASGI is kind of enabling.
+
+00:59:10.400 --> 00:59:14.240
+I've run all the test suites with pytest-freethreaded,
+
+00:59:14.560 --> 00:59:18.460
+which checks that your tests can run concurrently in the free-threaded builds.
+
+00:59:18.920 --> 00:59:20.580
+So go check that out; it's by Anthony Shaw.
+ +00:59:20.980 --> 00:59:23.160 +And I'm pretty sure Granian already supports free-threading. + +00:59:23.620 --> 00:59:25.400 +Not sure though, I haven't looked into Granian enough. + +00:59:25.480 --> 00:59:26.040 +But like- + +00:59:26.040 --> 00:59:27.040 +You know, I'm not sure either. + +00:59:27.040 --> 00:59:30.280 +It does have a runtime threaded mode + +00:59:30.280 --> 00:59:32.440 +but I don't know if that's truly free-threaded or not. + +00:59:32.440 --> 00:59:35.820 +All of those things combined make me pretty optimistic + +00:59:35.820 --> 00:59:38.840 +that Flask will be able to take advantage of this + +00:59:38.840 --> 00:59:40.540 +without much work from us. + +00:59:40.540 --> 00:59:42.240 +I mean, I know that's a big statement right there + +00:59:42.240 --> 00:59:43.480 +and I haven't tested it + +00:59:43.480 --> 00:59:45.160 +but the fact that we've emphasized + +00:59:45.160 --> 00:59:47.540 +all these different parts for so long already + +00:59:47.540 --> 00:59:48.340 +makes me confident about it. + +00:59:48.460 --> 00:59:49.660 +I'm also super excited about it. + +00:59:49.660 --> 00:59:51.680 +And just one final thought I'll throw out there + +00:59:51.790 --> 00:59:52.720 +before we call it a show, + +00:59:52.920 --> 00:59:54.640 +because we could go on for much longer, + +00:59:54.690 --> 00:59:55.260 +but we're out of time. + +00:59:55.580 --> 00:59:57.380 +I think once this comes along, + +00:59:57.860 --> 01:00:00.700 +whatever framework out of this choice you're using out there, + +01:00:01.080 --> 01:00:03.420 +there's a bunch of inner working pieces. + +01:00:04.400 --> 01:00:05.940 +One of them may have some kind of issue. 
+
+01:00:06.130 --> 01:00:08.340
+And I think it's worth doing some proper load testing
+
+01:00:08.830 --> 01:00:12.040
+on your app, you know, point something like locust.io at it
+
+01:00:12.200 --> 01:00:14.900
+and just say, well, what if we gave it 10,000 concurrent users
+
+01:00:14.970 --> 01:00:15.360
+for an hour?
+
+01:00:15.390 --> 01:00:16.120
+Does it stop working?
+
+01:00:16.600 --> 01:00:17.160
+Does it crash?
+
+01:00:17.290 --> 01:00:18.140
+Or does it just keep going?
+
+01:00:18.360 --> 01:00:21.960
+So that seems like a pretty good thing to do the first time before you deploy
+
+01:00:22.100 --> 01:00:23.320
+your first free-threaded version.
+
+01:00:23.840 --> 01:00:23.900
+Yeah.
+
+01:00:24.220 --> 01:00:24.620
+All right, everyone.
+
+01:00:25.440 --> 01:00:26.480
+I would love to talk some more.
+
+01:00:26.600 --> 01:00:30.020
+This is such a good conversation, but I also want to respect your time and all that.
+
+01:00:31.140 --> 01:00:32.220
+So thank you for being here.
+
+01:00:32.660 --> 01:00:35.320
+It's been an honor to get you all together and have this conversation.
+
+01:00:35.580 --> 01:00:37.000
+Thank you very much for having us.
+
+01:00:37.140 --> 01:00:37.400
+Thank you.
+
+01:00:37.640 --> 01:00:37.740
+Yeah.
+
+01:00:37.900 --> 01:00:38.600
+Thanks for having us all.
+
+01:00:38.760 --> 01:00:39.060
+Thanks, everybody.
+
+01:00:39.400 --> 01:00:39.480
+Yeah.
+
+01:00:39.700 --> 01:00:40.360
+It's nice being here.
+
+01:00:40.440 --> 01:00:40.580
+Yeah.
+
+01:00:40.700 --> 01:00:41.240
+Thanks for having us.
+
+01:00:41.480 --> 01:00:42.240
+Thanks for having us all.
+
+01:00:42.380 --> 01:00:42.540
+Bye.
+
+01:00:42.700 --> 01:00:43.060
+Bye-bye.
+
+01:00:44.380 --> 01:00:46.720
+This has been another episode of Talk Python To Me.
+
+01:00:47.020 --> 01:00:47.820
+Thank you to our sponsors.
+
+01:00:48.120 --> 01:00:49.320
+Be sure to check out what they're offering.
+
+01:00:49.510 --> 01:00:50.880
+It really helps support the show.
+
+01:00:51.300 --> 01:00:53.160
+If you or your team needs to learn Python,
+
+01:00:53.390 --> 01:00:56.780
+we have over 270 hours of beginner and advanced courses
+
+01:00:57.100 --> 01:01:00.520
+on topics ranging from complete beginners to async code,
+
+01:01:00.660 --> 01:01:03.440
+Flask, Django, HTMX, and even LLMs.
+
+01:01:03.660 --> 01:01:05.940
+Best of all, there's no subscription in sight.
+
+01:01:06.480 --> 01:01:08.260
+Browse the catalog at talkpython.fm.
+
+01:01:09.000 --> 01:01:10.940
+And if you're not already subscribed to the show
+
+01:01:11.160 --> 01:01:13.620
+on your favorite podcast player, what are you waiting for?
+
+01:01:14.240 --> 01:01:16.060
+Just search for Python in your podcast player.
+
+01:01:16.150 --> 01:01:17.040
+We should be right at the top.
+
+01:01:17.260 --> 01:01:20.260
+If you enjoyed that geeky rap song, you can download the full track.
+
+01:01:20.450 --> 01:01:22.280
+The link is in your podcast player's show notes.
+
+01:01:23.180 --> 01:01:24.500
+This is your host, Michael Kennedy.
+
+01:01:24.900 --> 01:01:25.960
+Thank you so much for listening.
+
+01:01:26.170 --> 01:01:26.980
+I really appreciate it.
+
+01:01:27.400 --> 01:01:28.100
+I'll see you next time.
+
+01:01:38.260 --> 01:01:39.540
+I started to meet.
+
+01:01:40.900 --> 01:01:42.000
+And we're ready to roll.
+
+01:01:43.780 --> 01:01:44.820
+Upgrade the code.
+
+01:01:45.700 --> 01:01:47.320
+No fear of getting whole.
+
+01:01:48.820 --> 01:01:51.960
+We tapped into that modern vibe over King Storm.
+
+01:01:53.020 --> 01:01:55.700
+Talk Python To Me, I-Sync is the norm.
+
diff --git a/youtube_transcripts/532-python-2025-year-in-review-youtube.vtt b/youtube_transcripts/532-python-2025-year-in-review-youtube.vtt
new file mode 100644
index 0000000..73c9c76
--- /dev/null
+++ b/youtube_transcripts/532-python-2025-year-in-review-youtube.vtt
@@ -0,0 +1,3884 @@
+WEBVTT
+
+00:00:00.760 --> 00:00:03.380
+Hey, everyone. It's so awesome to be here with you all.
+
+00:00:03.560 --> 00:00:05.840
+Thanks for taking the time out of your day to be part of
+
+00:00:05.980 --> 00:00:07.980
+Talk Python for this year in review,
+
+00:00:08.420 --> 00:00:09.780
+this Python year in review.
+
+00:00:10.840 --> 00:00:13.640
+Yeah, let's just jump right into it. Gregory.
+
+00:00:15.040 --> 00:00:16.920
+Hi. Welcome to the show. Welcome back to the show.
+
+00:00:17.660 --> 00:00:19.180
+Yeah. On a positive note,
+
+00:00:19.250 --> 00:00:20.360
+the last time I recorded,
+
+00:00:20.530 --> 00:00:23.600
+I had to record in my home in my closet,
+
+00:00:24.290 --> 00:00:26.140
+because the power was out throughout my city.
+
+00:00:26.920 --> 00:00:29.980
+Today, I'm actually back in my recording studio and glad to be able
+
+00:00:30.000 --> 00:00:34.760
+to chat with people. I'm an associate professor of computer and information science, and I do
+
+00:00:35.080 --> 00:00:39.300
+research in software engineering and software testing. I've built a bunch of Python tools,
+
+00:00:39.620 --> 00:00:43.640
+and one of the areas we're studying now is flaky test cases in Python projects.
+
+00:00:44.420 --> 00:00:49.500
+I'm also really excited about teaching in a wide variety of areas. In fact, I use Python for
+
+00:00:49.720 --> 00:00:55.220
+operating systems classes or theory of computation classes. And one of the things I'm excited about
+
+00:00:55.240 --> 00:00:57.000
+is being a podcast host.
+
+00:00:57.520 --> 00:01:00.360
+I'm also a host on the Software Engineering Radio podcast
+
+00:01:00.860 --> 00:01:02.740
+sponsored by the IEEE Computer Society.
+
+00:01:03.560 --> 00:01:04.940
+And I've had the cool opportunity
+
+00:01:05.300 --> 00:01:06.700
+to interview a whole bunch of people
+
+00:01:06.810 --> 00:01:07.780
+in the Python community.
+
+00:01:08.300 --> 00:01:10.140
+So Michael, thanks for welcoming me to the show.
+
+00:01:10.640 --> 00:01:11.900
+Yeah, it's awesome to have you back.
+
+00:01:12.120 --> 00:01:14.020
+And we talked about flaky tests last time.
+
+00:01:14.200 --> 00:01:18.020
+I do have to say your AV setup is quite good.
+
+00:01:18.600 --> 00:01:20.440
+I love the new mic and all that.
+
+00:01:22.719 --> 00:01:23.600
+Thomas, welcome.
+
+00:01:24.170 --> 00:01:24.580
+Awesome to have you.
+
+00:01:25.280 --> 00:01:34.600
+Thanks for having me. I'm Thomas Wouters. I'm a longtime Python core developer, although not as long as one of the other guests on this podcast.
+
+00:01:36.900 --> 00:01:46.560
+I worked at Google for 17 years. For the last year or so, I've worked at Meta. In both cases, I work on Python itself within the company and just deploying it internally.
+
+00:01:47.500 --> 00:01:51.880
+I've also been a board member of the PSF, although I'm not one right now.
+
+00:01:52.570 --> 00:01:59.000
+And I've been a steering council member for five years and currently not, because the
+
+00:01:59.160 --> 00:02:01.900
+elections are going on and I don't know what the result is going to be.
+
+00:02:02.060 --> 00:02:08.800
+But I think there's like a five-in-six chance that I'll be on the steering council since
+
+00:02:08.899 --> 00:02:14.320
+we only have six candidates for five positions when this episode probably airs.
+
+00:02:14.320 --> 00:02:14.700
+I don't know.
+
+00:02:15.720 --> 00:02:15.820
+Yeah.
+
+00:02:16.440 --> 00:02:16.580
+Amazing.
+
+00:02:17.200 --> 00:02:23.540
+That's quite the contribution to the whole community. Thank you. Oh, I always forget this.
+
+00:02:23.540 --> 00:02:28.320
+I also got the, what is it, the Distinguished Service Award from the PSF this year. I should
+
+00:02:28.440 --> 00:02:33.200
+probably mention that. So yes, I have been recognized. No need to talk about it further.
+
+00:02:34.580 --> 00:02:38.460
+Wonderful, wonderful. Jody, welcome back on the show. Awesome to catch up with you.
+
+00:02:39.170 --> 00:02:43.999
+Yeah, thanks for having me back. So I am a data scientist and developer advocate
+
+00:02:44.000 --> 00:02:46.380
+at JetBrains working on PyCharm.
+
+00:02:47.060 --> 00:02:49.480
+And I've been a data scientist for around 10 years.
+
+00:02:50.180 --> 00:02:53.640
+And prior to that, I was actually a clinical psychologist.
+
+00:02:53.900 --> 00:02:56.640
+So that was my training, my PhD,
+
+00:02:57.040 --> 00:02:59.720
+but abandoned academia for greener pastures.
+
+00:03:00.040 --> 00:03:00.600
+Let's put it that way.
+
+00:03:01.620 --> 00:03:02.280
+Noah Franz Gregory.
+
+00:03:06.740 --> 00:03:07.420
+Brett, hello.
+
+00:03:08.660 --> 00:03:09.100
+Good to see you.
+
+00:03:09.300 --> 00:03:09.440
+Hello.
+
+00:03:10.060 --> 00:03:10.200
+Yes.
+
+00:03:11.080 --> 00:03:11.780
+Okay, let's see here.
+
+00:03:12.040 --> 00:03:13.740
+I've been at Microsoft for 10 years.
+
+00:03:15.219 --> 00:03:18.720
+Started working on AI R&D for Python developers.
+
+00:03:19.840 --> 00:03:22.140
+Also keep WASI running for Python here
+
+00:03:22.820 --> 00:03:24.640
+and do a lot of internal consulting for teams.
+
+00:03:25.560 --> 00:03:28.820
+Outside, I am actually the shortest-running
+
+00:03:29.640 --> 00:03:31.000
+core developer on this call, amazingly,
+
+00:03:31.130 --> 00:03:32.640
+even though I've been doing it for 22 years.
+
+00:03:32.940 --> 00:03:34.920
+I've also only gotten the Frank Willison Award,
+
+00:03:35.140 --> 00:03:35.760
+not the DSA.
+
+00:03:36.480 --> 00:03:39.020
+So I feel very underaccomplished here as a core developer.
+
+00:03:41.100 --> 00:03:43.140
+Yeah, that's me in a nutshell.
+
+00:03:44.240 --> 00:03:45.760
+I'm still trying to catch Anthony Shaw.
+
+00:03:46.620 --> 00:03:47.280
+Most quoted.
+
+00:03:47.760 --> 00:03:49.020
+Yeah, most quoted.
+
+00:03:49.500 --> 00:03:51.640
+I will say actually at work, it is in my email footer
+
+00:03:51.700 --> 00:03:53.500
+that I'm a famous Python quotationist.
+
+00:03:54.300 --> 00:03:56.220
+That was Anthony Shaw's suggestion, by the way.
+
+00:03:56.480 --> 00:03:57.200
+That was not mine.
+
+00:03:57.920 --> 00:04:00.660
+But it does link to the April Fool's joke from last year.
+
+00:04:01.760 --> 00:04:03.680
+And I am still trying to catch Anthony Shaw, I think,
+
+00:04:03.720 --> 00:04:04.800
+on appearances on this podcast.
+
+00:04:05.300 --> 00:04:06.300
+Well, plus one.
+
+00:04:07.020 --> 00:04:08.320
+Anthony Shaw should be here, honestly.
+
+00:04:08.700 --> 00:04:09.820
+I mean, I put it out into Discord.
+
+00:04:11.160 --> 00:04:13.560
+He could have been here, but probably at an odd time.
+
+00:04:17.900 --> 00:04:21.040
+You used to work on VS Code a bunch,
+
+00:04:21.040 --> 00:04:22.320
+on the Python aspect of VS Code.
+
+00:04:22.330 --> 00:04:24.280
+You recently changed roles, right?
+
+00:04:25.240 --> 00:04:26.080
+Not recently.
+
+00:04:26.240 --> 00:04:27.840
+That was, I used to be the dev manager.
+
+00:04:27.840 --> 00:04:29.340
+By the 70 years, years ago?
+
+00:04:31.240 --> 00:04:32.840
+Yeah, September of 2024.
+
+00:04:33.770 --> 00:04:34.440
+So it's been over a year.
+
+00:04:34.660 --> 00:04:36.220
+That counts as recent for me.
+
+00:04:37.840 --> 00:04:41.320
+Yes, I used to be the dev manager for the Python experience in VS Code.
+ +00:04:41.820 --> 00:04:42.180 +Okay. + +00:04:42.970 --> 00:04:43.220 +Very cool. + +00:04:43.420 --> 00:04:44.200 +That's quite a shift. + +00:04:45.120 --> 00:04:46.480 +Yeah, it went back to being an IC, basically. + +00:04:49.380 --> 00:04:51.080 +You're good at your TPS reports again now? + +00:04:53.100 --> 00:04:55.020 +Actually, I just did do my connect, so I kind of did. + +00:04:55.520 --> 00:04:55.840 +Awesome. + +00:04:56.599 --> 00:05:00.040 +Reuben, I bet you haven't filed a TPS report in at least a year. + +00:05:02.260 --> 00:05:03.120 +At least, at least. + +00:05:04.280 --> 00:05:04.940 +So yeah, I'm Reuben. + +00:05:05.820 --> 00:05:09.080 +I'm a freelance Python and Pandas trainer. + +00:05:09.600 --> 00:05:13.000 +I just celebrated this past week 30 years since going freelance. + +00:05:14.619 --> 00:05:16.480 +So I guess it's working out okay. + +00:05:18.300 --> 00:05:20.180 +We'll know at some point if I need to get a real job. + +00:05:21.120 --> 00:05:25.120 +So I teach Python Pandas both at companies and on my online platform. + +00:05:25.340 --> 00:05:26.060 +I have newsletters. + +00:05:26.200 --> 00:05:30.860 +I've written books, speak at conferences, and generally try to help people improve their + +00:05:31.280 --> 00:05:34.859 +Python and Pandas fluency and confidence and have a lot of fun with this community + +00:05:35.300 --> 00:05:36.080 +as well as with the language. + +00:05:36.760 --> 00:05:37.140 +Yeah, awesome. + +00:05:37.390 --> 00:05:38.120 +Oh, good to have you here. + +00:05:38.980 --> 00:05:41.160 +Barry, it's great to have a musician on the show. + +00:05:42.219 --> 00:05:42.580 +Thanks. + +00:05:43.880 --> 00:05:45.320 +Yeah, I've got my basses over here. + +00:05:45.620 --> 00:05:47.760 +So, you know, if you need to be serenaded. + +00:05:48.340 --> 00:05:50.820 +Yeah, like a Xenopython may break out at any moment. + +00:05:53.040 --> 00:05:53.880 +Good, good, good. 
+ +00:05:54.719 --> 00:05:57.100 +Well, thanks for having me here. + +00:05:57.840 --> 00:06:01.820 +Yeah, I've been a core developer for a long time, since 1994. + +00:06:04.460 --> 00:06:14.500 +And I've been, you know, in the early days, I did tons of stuff for Python.org, worked with Guido at CNRI. + +00:06:14.990 --> 00:06:27.020 +And, you know, we moved everything from the mailing, you know, the Postmaster stuff and version control systems back in the day, websites, all that kind of stuff. + +00:06:27.090 --> 00:06:29.260 +I try to not do any of those things anymore. + +00:06:30.020 --> 00:06:33.200 +There's more, way more competent people doing that stuff now. + +00:06:35.680 --> 00:06:37.400 +I have been a release manager + +00:06:39.660 --> 00:06:41.540 +I'm currently back + +00:06:41.660 --> 00:06:42.720 +on the steering council + +00:06:44.020 --> 00:06:45.060 +and running again + +00:06:45.140 --> 00:06:47.200 +so between Thomas and I + +00:06:47.200 --> 00:06:48.900 +we'll see who makes it to + +00:06:49.320 --> 00:06:50.420 +six years I guess + +00:06:52.680 --> 00:06:55.060 +and I'm currently working for + +00:06:55.300 --> 00:06:57.000 +NVIDIA and I do + +00:06:57.040 --> 00:06:57.980 +all Python stuff + +00:06:59.440 --> 00:07:00.160 +some + +00:07:01.540 --> 00:07:03.280 +half and half roughly of + +00:07:03.300 --> 00:07:11.340 +things and external open source community work, both in packaging and in core Python. + +00:07:12.490 --> 00:07:14.520 +So that's, I guess, quick. + +00:07:15.200 --> 00:07:15.860 +Yeah, awesome. + +00:07:16.110 --> 00:07:16.920 +I think that's about it. + +00:07:17.520 --> 00:07:20.360 +Yeah, you all are living in exciting tech spaces. + +00:07:20.640 --> 00:07:21.220 +That's for sure. + +00:07:21.600 --> 00:07:22.040 +That's for sure. + +00:07:22.650 --> 00:07:23.040 +For sure. + +00:07:23.200 --> 00:07:23.280 +Yeah. + +00:07:23.650 --> 00:07:26.900 +Well, great to have you all back on the show. 
+
+00:07:28.280 --> 00:07:29.320
+Let's start with our first topic.
+
+00:07:29.480 --> 00:07:37.920
+So the idea is we've each picked at least one thing that we think stood out in 2025 in the Python space that we can focus on.
+
+00:07:38.100 --> 00:07:41.740
+And, yeah, well, let's go with Jody first.
+
+00:07:42.700 --> 00:07:45.340
+I'm excited to hear what you thought was one of the bigger things.
+
+00:07:46.520 --> 00:07:46.620
+Yeah.
+
+00:07:46.960 --> 00:07:49.360
+So I'm going to mention AI.
+
+00:07:49.960 --> 00:07:52.440
+Like, wow, what a, you know, big surprise.
+
+00:07:53.340 --> 00:07:59.700
+So to kind of give context of where I'm coming from, I've been working in NLP for a long time.
+
+00:07:59.750 --> 00:08:02.120
+I like to say I was working on LLMs before they were cool.
+
+00:08:03.620 --> 00:08:10.240
+So sort of playing around with the very first releases from Google in like 2019, incorporating that into search.
+
+00:08:11.280 --> 00:08:16.960
+So I've been very interested sort of seeing the unfolding of the GPT models as they've grown.
+
+00:08:17.820 --> 00:08:32.860
+And let's say I'm slightly disgusted by the discourse around the models as they become more mainstream, more sort of the talk about people's jobs being replaced, a lot of the hysteria, a lot of the doomsday stuff.
+
+00:08:33.080 --> 00:08:43.120
+So I've been doing talks and other content for around two and a half years now, just trying to cut through the hype a bit, being like, you know, they're just language models, they're good for language tasks.
+
+00:08:43.260 --> 00:08:46.080
+Let's think about realistically what they're about.
+
+00:08:46.460 --> 00:08:51.940
+And what was very interesting for me this year, I've been incorrectly predicting the bubble
+
+00:08:52.220 --> 00:08:53.920
+bursting for about two and a half years.
+
+00:08:54.420 --> 00:09:00.880
+So I was quite vindicated when in August GPT-5 came out and all of a sudden everyone else
+
+00:09:01.360 --> 00:09:02.800
+started saying, maybe this is a bubble.
+
+00:09:03.620 --> 00:09:03.740
+Yeah.
+
+00:09:04.060 --> 00:09:08.840
+So you think that was the first big release that was kind of a letdown compared to what
+
+00:09:08.840 --> 00:09:09.500
+the hype was?
+
+00:09:10.180 --> 00:09:10.360
+Yeah.
+
+00:09:10.500 --> 00:09:11.240
+And it was really interesting.
+
+00:09:11.560 --> 00:09:15.600
+So I found this really nice Atlantic article and I didn't save it, unfortunately.
+
+00:09:16.140 --> 00:09:22.520
+But essentially it told sort of the whole story of what was going on behind the scenes. So GPT-4
+
+00:09:22.840 --> 00:09:29.120
+came out in March of 2023. And that was the model that came out with this Microsoft research paper
+
+00:09:29.340 --> 00:09:33.980
+saying, you know, sparks of AGI, artificial general intelligence, blah, blah, blah. And from that
+
+00:09:34.320 --> 00:09:43.019
+point, there was really this big expectation sort of fueled by OpenAI that GPT-5 was going to be
+
+00:09:42.980 --> 00:09:49.180
+an AGI model. And it turns out what was happening internally is these scaling laws that were sort of
+
+00:09:49.520 --> 00:09:55.260
+considered, you know, this exponential growth thing that would sort of push the power of these models
+
+00:09:55.680 --> 00:10:01.940
+perhaps towards human-like performance. They weren't laws at all. And of course, they started failing.
+
+00:10:02.280 --> 00:10:07.180
+So the model that they'd originally pitched as GPT-5 just didn't live up to performance. They
+
+00:10:07.340 --> 00:10:11.920
+started this post-training stuff where they were going more into like specialized reasoning models.
+
+00:10:12.300 --> 00:10:16.100
+And what we have now are good models that are good for specific tasks.
+
+00:10:16.900 --> 00:10:22.460
+But I don't know what happened, but eventually they had to put the GPT-5 label on something.
+
+00:10:23.560 --> 00:10:27.920
+And yeah, let's say it didn't live up to expectations.
+
+00:10:28.340 --> 00:10:39.920
+So I think the cracks are starting to show because the underlying expectation always was this will be improving to the point where anything's possible.
+
+00:10:40.220 --> 00:10:42.380
+And you can't put a price on that.
+
+00:10:43.130 --> 00:10:46.340
+But it turns out that if maybe there's a limit on what's possible,
+
+00:10:47.300 --> 00:10:48.920
+yeah, you can put a price on it.
+
+00:10:49.620 --> 00:10:52.580
+And a lot of the valuations are on the first part.
+
+00:10:53.980 --> 00:10:56.180
+Yes, and it's always been a bit interesting to me
+
+00:10:56.400 --> 00:10:57.960
+because I come from a scientific background
+
+00:10:58.250 --> 00:11:00.260
+and you need to know how to measure stuff, right?
+
+00:11:00.900 --> 00:11:03.000
+And I'm like, what are you trying to achieve?
+
+00:11:03.520 --> 00:11:05.580
+Like Gregory's nodding, like please jump in.
+
+00:11:06.060 --> 00:11:09.000
+I'm on my monologue, so please don't interrupt me.
+
+00:11:10.980 --> 00:11:16.080
+You really need to understand what you're actually trying to get these models to do. What is AGI?
+
+00:11:16.580 --> 00:11:24.160
+No one knows this. And what's going to be possible with this? And it's more science fiction than fact.
+
+00:11:24.720 --> 00:11:28.980
+So this for me has been the big news this year. And I'm feeling slightly smug,
+
+00:11:29.180 --> 00:11:32.640
+I'm going to be honest, even though my predictions were off by about a year and a half.
+
+00:11:34.800 --> 00:11:38.900
+Yeah, maybe it's not an exponential curve. It's a saturating S-curve with an asymptote.
+
+00:11:40.080 --> 00:11:40.800
+Yeah, sigmoid.
+
+00:11:41.280 --> 00:11:41.420
+Yeah.
+
+00:11:41.660 --> 00:11:42.460
+Yeah, yeah, yeah.
+ +00:11:42.640 --> 00:11:46.720 +I mean, I think we have to sort of separate the technology from the business. + +00:11:47.370 --> 00:11:58.520 +And the technology, even if it doesn't get any better, even if we stay with what we have today, I still think this is like one of the most amazing technologies I've ever seen. + +00:11:59.120 --> 00:12:00.140 +It's not a god. + +00:12:00.500 --> 00:12:01.760 +It's not a panacea. + +00:12:02.400 --> 00:12:06.380 +But it's like a chainsaw that if you know how to use it, it's really effective. + +00:12:07.160 --> 00:12:11.000 +But in the hands of amateurs, you can really get hurt. + +00:12:11.620 --> 00:12:17.220 +And so, yes, it's great to see this sort of thing happening and improving, but who knows + +00:12:17.320 --> 00:12:17.820 +where it's going to go. + +00:12:17.920 --> 00:12:19.320 +And I'm a little skeptical of the AGI thing. + +00:12:19.440 --> 00:12:23.880 +What I'm a little more worried about is that these companies seem to have no possible way + +00:12:24.220 --> 00:12:27.300 +of ever making the money that they're promising to their investors. + +00:12:28.140 --> 00:12:36.900 +And I do worry a lot that we're sort of like a year 2000 situation where, yeah, the technology + +00:12:36.920 --> 00:12:47.320 +But the businesses are unsustainable. And out of the ashes of what will happen, we will get some amazing technology and even better than we had before. But there are going to be ashes. + +00:12:50.600 --> 00:13:14.880 +Yeah. For me, that also makes me worry. And I don't know if anyone reads Ed Zitron here. He's a journalist kind of digging into the state of the AI industry. He does get a bit sort of, he has a bit of a reputation as a bit of a crank now. So I think he's leaned into that pretty hard. But he does take the time to also pull out numbers and point out things that don't make sense. 
+
+00:13:15.880 --> 00:13:18.320
+And he was one of the first ones to blow the whistle
+
+00:13:18.520 --> 00:13:20.540
+on this circular funding we've been seeing.
+
+00:13:20.880 --> 00:13:28.840
+So the worry, of course, is when a lot of this becomes borrowings
+
+00:13:29.640 --> 00:13:32.620
+from banks and then that starts dragging in funding
+
+00:13:33.020 --> 00:13:36.380
+from everyday people and also the effect that this has had
+
+00:13:36.460 --> 00:13:39.040
+on particularly the US economy, like the stock market.
+
+00:13:39.640 --> 00:13:43.620
+I think the investment in AI spending now exceeds consumer spending
+
+00:13:43.740 --> 00:13:49.320
+in the US, which is a really scary prospect.
+
+00:13:50.220 --> 00:13:50.780
+That is crazy.
+
+00:13:53.580 --> 00:13:56.960
+But yeah, also as Reuben said, like I love LLMs.
+
+00:13:58.140 --> 00:14:00.160
+They are the most powerful tools we've ever had
+
+00:14:00.170 --> 00:14:01.420
+for natural language processing.
+
+00:14:02.120 --> 00:14:04.200
+It's phenomenal the problems we can solve with them now.
+
+00:14:04.400 --> 00:14:06.400
+Like I didn't think this sort of stuff would be possible
+
+00:14:07.070 --> 00:14:08.260
+when I started in data science.
+
+00:14:10.100 --> 00:14:12.140
+I still think there's a use case for agents,
+
+00:14:12.380 --> 00:14:15.260
+although I do think they've been a bit overstated,
+
+00:14:15.960 --> 00:14:17.440
+especially now that I'm building them.
+
+00:14:18.760 --> 00:14:21.280
+Let's say it's not very fun building non-deterministic software.
+
+00:14:21.420 --> 00:14:22.440
+It's quite frustrating, actually.
+
+00:14:24.420 --> 00:14:27.400
+But I hope we're going to see improvements in the framework,
+
+00:14:27.760 --> 00:14:30.080
+particularly I've heard good things about Pydantic AI.
+
+00:14:31.100 --> 00:14:34.220
+And yeah, hopefully we can control the input outputs
+
+00:14:34.680 --> 00:14:36.200
+and make them a bit more strict.
+
+00:14:36.460 --> 00:14:38.480
+This will fix a lot of the problems.
+
+00:14:39.720 --> 00:14:41.940
+One thing I do want to put out in this conversation,
+
+00:14:41.980 --> 00:14:45.520
+I think is worth separating.
+
+00:14:45.680 --> 00:14:47.100
+And Reuben, you touched on this some,
+
+00:14:47.600 --> 00:14:51.120
+but I want to suggest to you all,
+
+00:14:51.160 --> 00:14:52.400
+throw this out to you all and see what you think.
+
+00:14:53.260 --> 00:14:56.040
+I think it's very possible that this AI bubble
+
+00:14:56.920 --> 00:14:59.080
+crashes the economy and causes bad things
+
+00:14:59.380 --> 00:15:01.020
+economically to happen and a bunch of companies
+
+00:15:01.040 --> 00:15:05.440
+that are like wrappers over the OpenAI API go away.
+
+00:15:07.000 --> 00:15:10.760
+But I don't think things like the agentic coding tools
+
+00:15:12.120 --> 00:15:12.840
+will vanish.
+
+00:15:13.180 --> 00:15:14.760
+They might stop training. They might slow their
+
+00:15:14.960 --> 00:15:15.960
+advance because that's super expensive.
+
+00:15:17.540 --> 00:15:18.620
+Even as you said,
+
+00:15:18.780 --> 00:15:20.600
+even if we just had
+
+00:15:20.740 --> 00:15:22.720
+Claude Sonnet 4
+
+00:15:23.900 --> 00:15:24.860
+and the world
+
+00:15:25.100 --> 00:15:26.860
+never got something else, it would be so much
+
+00:15:27.440 --> 00:15:28.860
+farther beyond autocomplete
+
+00:15:29.270 --> 00:15:30.800
+and the other stuff that we had before and
+
+00:15:31.000 --> 00:15:32.080
+Stack Overflow that it's
+
+00:15:32.740 --> 00:15:34.740
+I don't think it's going to go. The reason I'm throwing this out there
+
+00:15:34.800 --> 00:15:36.760
+is I was talking to somebody and they were like, well, I don't think
+
+00:15:36.820 --> 00:15:38.940
+it's worth learning because I think the bubble is going to
+
+00:15:38.960 --> 00:15:42.600
+pop, and so I don't want to learn this agentic coding because it won't be around very long.
+
+00:15:43.660 --> 00:15:44.020
+What do you all think?
+
+00:15:46.880 --> 00:15:50.600
+Yeah, I think it's here to stay. I think it's just
+
+00:15:51.020 --> 00:15:54.820
+where is the limit? Where does it stop potentially? I think that's the
+
+00:15:54.830 --> 00:15:58.860
+big open question for everybody, right? Pragmatically, it's a tool.
+
+00:15:59.480 --> 00:16:02.960
+It's useful in some scenarios and not in others. And you
+
+00:16:02.990 --> 00:16:06.680
+just have to learn how to use the tool appropriately for your use case and to get what you need out of it.
+
+00:16:06.780 --> 00:16:10.860
+And sometimes that's not using it because it's just going to take up more time than it will to be productive.
+
+00:16:10.940 --> 00:16:14.940
+But other times it fully juices up your productivity and you can get more done.
+
+00:16:15.100 --> 00:16:16.360
+It's give and take.
+
+00:16:17.040 --> 00:16:21.940
+But I don't think it's going to go anywhere because as you said, Michael, there's even academics doing research now.
+
+00:16:22.140 --> 00:16:23.880
+There's open-weight models as well.
+
+00:16:24.540 --> 00:16:35.100
+There's a lot of different ways to run this, whether you're at the scale of the frontier models that are doing these huge trainings or you're doing something local and more specialized.
+
+00:16:35.160 --> 00:16:38.540
+So I think the general use of AI isn't going anywhere.
+
+00:16:38.610 --> 00:16:49.600
+I think it's just a question of how far can this current trend go and where will it, I don't want to say stop, because that plays into the whole, it's going to completely go away.
+
+00:16:49.600 --> 00:16:50.280
+I don't think it ever will.
+
+00:16:50.290 --> 00:16:54.320
+I think it's just going to be where are we going to start to potentially bump up against limits.
+
+00:16:55.840 --> 00:17:00.080
+One thing that I'll say is that many of these systems are almost, to me, like a dream come true.
+
+00:17:00.800 --> 00:17:06.860
+Now, admittedly, it's the case that the systems I'm building are maybe only tens of thousands of lines or hundreds of thousands of lines.
+
+00:17:07.439 --> 00:17:26.500
+But I can remember thinking to myself, how cool would it be if I had a system that could automatically refactor and then add test cases and increase the code coverage and make sure all my checkers and linters pass and do that automatically and continue the process until it achieved its goal?
+
+00:17:27.360 --> 00:17:45.100
+And I remember thinking that five to seven years ago, I would never realize that goal in my entire lifetime. And now when I use Anthropic's models through OpenCode or Claude Code, it's incredible how much you can achieve so quickly, even for systems that are of medium to moderate scale.
+
+00:17:45.700 --> 00:18:06.080
+So from my vantage point, it is a really exciting tool. It's incredibly powerful. And what I have found is that the LLMs are much better when I teach them how to use tools. And the tools that it's using are actually really quick, fast ones that can give rapid feedback to the LLM and tell it whether it's moving in the right direction or not.
+
+00:18:06.640 --> 00:18:13.400
+Yeah, there's an engineering angle to this. It's not just vibe coding if you take the time to learn it.
+
+00:18:14.840 --> 00:18:40.080
+There was actually a very interesting study. I don't think the study itself has been released. I haven't found it yet, but I saw a talk on it by some guys at Stanford. So they call it the 10K developer study. And basically what they were doing was studying real code bases, including I think 80% of them were actually private code bases, and seeing the point where the team started adopting AI.
+
+00:18:41.160 --> 00:18:43.720
+And so their findings are really interesting and nuanced.
+
+00:18:43.720 --> 00:18:48.520
+And I think they probably intuitively align with what a lot of us have experienced with AI.
+ +00:18:49.520 --> 00:19:00.840 +So basically, yes, there are productivity boosts, but it produces a lot of code, but the code tends to be worse than the code you would write and also introduces more bugs. + +00:19:01.180 --> 00:19:06.900 +So when you account for the time that you spend refactoring and debugging, you're still more productive. + +00:19:07.800 --> 00:19:10.900 +But then it also depends on the type of project, as Gregory was saying. + +00:19:11.180 --> 00:19:12.740 +So it's better for greenfield projects. + +00:19:13.260 --> 00:19:14.640 +It's better for smaller code bases. + +00:19:14.960 --> 00:19:16.020 +It's better for simpler problems. + +00:19:16.220 --> 00:19:21.020 +And it's better for more popular languages because, obviously, there's more training data. + +00:19:21.880 --> 00:19:24.860 +And so this was actually I like this study so much. + +00:19:24.960 --> 00:19:27.660 +I'll actually share it with you, Michael, if you want to put it in the show notes. + +00:19:27.880 --> 00:19:31.820 +But it shows that, yeah, the picture is not that simple. + +00:19:31.900 --> 00:19:37.120 +And all this conflicting information and conflicting experiences people were having line up completely with this. + +00:19:37.240 --> 00:19:42.060 +So again, like I work at an IDE company, it's tools for the job. + +00:19:42.620 --> 00:19:45.260 +It's not like your IDE will replace you. + +00:19:46.280 --> 00:19:47.500 +AI is not going to replace you. + +00:19:47.620 --> 00:19:50.460 +It's just going to make you maybe more productive sometimes. + +00:19:51.460 --> 00:19:51.580 +Yeah. + +00:19:51.820 --> 00:19:53.020 +Wait, IDE, you work for me. + +00:20:00.500 --> 00:20:05.040 +I mean, the other thing is a lot of people and a lot of the sort of when people talk about + +00:20:05.160 --> 00:20:13.180 +AI and LLMs and so forth in context of coding, it's the LLM writing code for us. 
And maybe because
+
+00:20:13.340 --> 00:20:19.160
+I'm not doing a lot of serious coding, it's more instruction, so forth. I use it as like a sparring
+
+00:20:19.480 --> 00:20:25.920
+or brainstorming partner. So it does checking of my newsletters for language and for tech edits
+
+00:20:26.400 --> 00:20:31.760
+and just sort of exploring ideas. And for that, maybe it's because I do everything in the last
+
+00:20:31.740 --> 00:20:35.060
+minute and I don't have other people around or I'm lazy or cheap and don't want to pay them.
+
+00:20:35.720 --> 00:20:39.840
+But definitely the quality of my work has improved dramatically. The quality of my understanding has
+
+00:20:40.000 --> 00:20:45.700
+improved, even if it never wrote a line of code for me. Just getting that feedback on a regular
+
+00:20:45.940 --> 00:20:51.140
+automatic basis is really helpful. Yeah, I totally agree with you. All right. We don't want to spend
+
+00:20:51.320 --> 00:20:57.300
+too much time on this topic, even though I believe Jody has put her finger on what might be the
+
+00:20:57.320 --> 00:21:02.360
+biggest tidal wave of 2025, but still, any quick parting thoughts?
+
+00:21:02.480 --> 00:21:02.820
+Anyone else?
+
+00:21:04.140 --> 00:21:06.520
+I'm glad I'll never have to write bash from scratch ever again.
+
+00:21:08.820 --> 00:21:10.300
+Tell me about it. Yeah.
+
+00:21:12.240 --> 00:21:14.840
+I'll just, I'll just say, anecdotally,
+
+00:21:14.860 --> 00:21:20.440
+the thing that I love about it is when I need to do something and I need to go
+
+00:21:20.640 --> 00:21:25.800
+through docs, online docs for whatever it is, you know,
+
+00:21:25.960 --> 00:21:29.880
+it might be GitLab or some library that I want to use or something like that.
+
+00:21:30.760 --> 00:21:32.360
+I never even search for the docs.
+
+00:21:32.400 --> 00:21:34.260
+I never even try to read the docs anymore.
+ +00:21:34.380 --> 00:21:40.340 +I just say, hey, you know, whatever model I need to set up this website, + +00:21:41.120 --> 00:21:43.720 +and just tell me what to do or just do it. + +00:21:43.880 --> 00:21:47.920 +And it's an immense time saver and productivity. + +00:21:48.060 --> 00:21:52.580 +And then it gets me bootstrapped to the place where now I can start to be creative. + +00:21:52.920 --> 00:22:00.540 +I don't have to worry about just like digging through pages and pages and pages of docs to figure out one little setting here or there. + +00:22:01.220 --> 00:22:02.960 +That's an amazing time saver. + +00:22:03.760 --> 00:22:05.040 +Yeah, that's a really good point. + +00:22:05.320 --> 00:22:12.400 +Another thing that I have noticed, there might be many things for which I had a really good mental model, but my brain can only store so much information. + +00:22:13.080 --> 00:22:18.620 +So, for example, I know lots about the abstract syntax tree for Python, but I forget that sometimes. + +00:22:19.400 --> 00:22:24.400 +And so it's really nice for me to be able to bring that back into my mind quickly with an LLM. + +00:22:24.750 --> 00:22:33.040 +And if it's generating code for me that's doing a type of AST parsing, I can tell whether that's good code or not because I can refresh that mental model. + +00:22:33.500 --> 00:22:40.560 +So in those situations, it's not only the docs, but it's something that I used to know really well that I have forgotten some of. + +00:22:40.790 --> 00:22:47.160 +And the LLM often is very powerful when it comes to refreshing my memory and helping me to get started and move more quickly. + +00:22:49.620 --> 00:22:54.240 +All right. Out of time, I think. Let's move on to Brett. What do you got, Brett? + +00:22:56.780 --> 00:23:10.000 +Well, I actually originally said we should talk about AI, but Jody had a way better pitch for it than I did because my internal pitch was literally AI. 
Do I actually have to write a paragraph explaining why? Then Jody actually did write the paragraph, so she did a much better job than I did. + +00:23:10.560 --> 00:23:16.480 +So the other topic I had was using tools to run your Python code. + +00:23:16.930 --> 00:23:19.440 +And what I mean by that is traditionally, if you think about it, + +00:23:21.840 --> 00:23:23.280 +you install the Python interpreter, right? + +00:23:25.300 --> 00:23:27.560 +Hopefully you create a virtual environment, install your dependencies, + +00:23:27.960 --> 00:23:30.660 +and you call the Python interpreter in your virtual environment to run your code. + +00:23:31.080 --> 00:23:32.880 +And those are all the steps you went through to run stuff. + +00:23:33.510 --> 00:23:37.760 +But now we've got tools that will compress all that into a run command + +00:23:38.250 --> 00:23:39.060 +and just do it all for you. + +00:23:39.320 --> 00:23:48.800 +And it seems like the community has shown a level of comfort with that, that I'd say snuck up on me a little bit. + +00:23:49.200 --> 00:23:53.800 +But I would say that I think it's a good thing, right? + +00:23:53.860 --> 00:23:59.500 +It's showing us, I'm going to say us, as the junior core developer here on this call. + +00:24:00.940 --> 00:24:08.600 +As to, sorry to make you too feel old, but admittedly, Barry did write my letter of recommendation to my master's program. + +00:24:08.740 --> 00:24:12.840 +So Barry already knows this. 
+
+00:24:13.140 --> 00:24:36.840
+Anyway, because I think what happened was like, yeah, we had Hatch and PDM and Poetry before that and uv as of last year all kind of come through and all kind of build on each other and take ideas from each other and kind of just slowly build up this kind of repertoire of tool approaches that they all kind of have a baseline, not that synergy is the right word,
+
+00:24:37.000 --> 00:24:39.960
+but they share just kind of an approach to certain things
+
+00:24:40.410 --> 00:24:43.060
+with their own twists and added takes on things.
+
+00:24:43.300 --> 00:24:45.760
+But in general, this whole like, you know what?
+
+00:24:45.830 --> 00:24:47.140
+You can just tell us to run this code
+
+00:24:47.230 --> 00:24:48.240
+and we will just run it, right?
+
+00:24:48.460 --> 00:24:50.060
+Like in-line script metadata coming in
+
+00:24:50.180 --> 00:24:52.060
+and helping make that more of a thing.
+
+00:24:53.160 --> 00:24:55.080
+I'll just note that I was the PEP delegate
+
+00:24:55.400 --> 00:24:56.240
+for getting that in.
+
+00:24:57.960 --> 00:25:00.180
+But I just think that's been a really awesome trend
+
+00:25:01.500 --> 00:25:05.160
+and I'm hoping we can kind of leverage that a bit.
+
+00:25:05.300 --> 00:25:08.040
+I have personal plans that we don't need to go into here.
+
+00:25:08.700 --> 00:25:10.520
+I'm hoping as a Python core team,
+
+00:25:10.530 --> 00:25:12.880
+we can help boost this stuff up a bit
+
+00:25:12.880 --> 00:25:15.160
+and help keep a good baseline for this for everyone.
+
+00:25:15.300 --> 00:25:18.000
+I think it's shown that Python is still really good for beginners.
+
+00:25:18.270 --> 00:25:21.780
+You just have to give them the tools to hide some of the details
+
+00:25:21.990 --> 00:25:23.200
+so you don't shoot yourself in the foot.
+
+00:25:24.210 --> 00:25:25.000
+It's a great outcome.
+
+00:25:26.320 --> 00:25:29.700
+2025 might be the year that the Python tools stepped outside of Python.
+ +00:25:30.230 --> 00:25:34.420 +Instead of being you install Python and then use the tools, + +00:25:35.140 --> 00:25:36.780 +You do the tool to get Python, right? + +00:25:37.080 --> 00:25:38.620 +Like uv and PDM and others. + +00:25:38.820 --> 00:25:40.640 +Yeah, and inverted the dependency graph + +00:25:40.760 --> 00:25:42.960 +in terms of just how you put yourself in, right? + +00:25:43.360 --> 00:25:45.800 +I think the interesting thing is these tools treat Python + +00:25:46.200 --> 00:25:48.860 +as an implementation detail almost, right? + +00:25:49.640 --> 00:25:53.260 +When you just say uv or hatch run or PDM run thing, + +00:25:55.580 --> 00:25:57.280 +these tools don't make you have to think about the interpreter. + +00:25:57.500 --> 00:26:00.420 +It's just a thing that they pull in to make your code run, right? + +00:26:00.540 --> 00:26:03.180 +It's not even necessarily something you have to care about + +00:26:03.180 --> 00:26:03.880 +if you choose not to. + +00:26:04.060 --> 00:26:08.540 +And it's an interesting shift in that perspective, at least for me. + +00:26:08.670 --> 00:26:10.000 +But I've also been doing this for a long time. + +00:26:10.780 --> 00:26:13.260 +Yeah, I think you're really on to something. + +00:26:13.570 --> 00:26:16.260 +And what I love at sort of a high level is this, + +00:26:16.770 --> 00:26:19.740 +I think there's a renewed focus on the user experience. + +00:26:21.660 --> 00:26:27.500 +And like uv plus the PEP 723, the inline metadata, + +00:26:27.870 --> 00:26:31.240 +you can put uv in the shebang line of your script. + +00:26:31.980 --> 00:26:34.360 +And now you don't have to think about anything. + +00:26:34.660 --> 00:26:36.680 +You know, you get uv from somewhere + +00:26:37.400 --> 00:26:39.540 +and then it takes care of everything. 
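The combination being described here, PEP 723 inline script metadata plus uv in the shebang line, can be sketched like this; the script name and the `requests` dependency are purely illustrative:

```python
#!/usr/bin/env -S uv run --script
# /// script
# requires-python = ">=3.12"
# dependencies = ["requests"]  # illustrative; list whatever the script needs
# ///
# With this shebang, running ./demo.py has uv read the metadata block above,
# build (and cache) an environment with those dependencies, and run the script.
# The body below is stdlib-only, so it also runs under a plain interpreter.
import sys

banner = f"running on Python {sys.version_info.major}.{sys.version_info.minor}"
print(banner)
```

Run it with `uv run demo.py` (or `./demo.py` once it's executable); uv resolves the environment on first run and reuses the cache afterwards, so there is no venv to manage by hand.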
+ +00:26:40.720 --> 00:26:42.840 +And Hatch can work the same way for, + +00:26:43.280 --> 00:26:44.000 +I think for developers, + +00:26:44.280 --> 00:26:49.880 +but like this renewed focus on getting, + +00:26:50.340 --> 00:26:52.580 +like installing your Python executable. + +00:26:52.920 --> 00:26:54.320 +Like you don't really have to think about, + +00:26:55.060 --> 00:26:56.820 +because those things are very complicated + +00:26:56.870 --> 00:26:59.220 +and people just want to kind of hit the ground running. + +00:26:59.840 --> 00:27:01.940 +And so if you think about the previous discussion + +00:27:01.960 --> 00:27:06.920 +about AI, right? Like, like I just want things to work. I have, I know what I want to do. + +00:27:07.630 --> 00:27:15.080 +I can see it. I can see the vision of it. And I just don't want to. And an analogy is like when + +00:27:15.080 --> 00:27:21.060 +I first learned Python and I came from C++ and all those languages. And I thought, oh my gosh, + +00:27:21.210 --> 00:27:27.120 +just to get like, hello world, I have to do a million little things that I shouldn't have to do, + +00:27:27.480 --> 00:27:35.160 +Like create a main and get my braces right and get all my variables right and get my pound includes correct. + +00:27:35.280 --> 00:27:37.360 +And now I don't have to think about any of that stuff. + +00:27:37.820 --> 00:27:49.960 +And the thing that was eye-opening for me with Python was the distance between vision of what I wanted and working code just really narrowed. + +00:27:49.980 --> 00:27:56.580 +And I think that as we are starting to think about tools and environments and how to bootstrap all this stuff, + +00:27:56.800 --> 00:28:01.840 +We're also now taking all that stuff away because people honestly don't care. + +00:28:01.960 --> 00:28:03.620 +I don't care about any of that stuff. 
+
+00:28:03.660 --> 00:28:07.540
+I just want to go from like, oh, I woke up this morning and had a cool idea
+
+00:28:07.620 --> 00:28:09.600
+and I just wanted to get it working.
+
+00:28:10.420 --> 00:28:12.660
+Or you wanted to share it so you could just share the script
+
+00:28:12.780 --> 00:28:15.880
+and you don't have to say, here's your steps that you get started with.
+
+00:28:16.899 --> 00:28:17.700
+Exactly, exactly.
+
+00:28:19.340 --> 00:28:22.220
+I want to thank the two of you for – I'm sorry, go ahead.
+
+00:28:22.940 --> 00:28:29.080
+I'm just going to say, like, for years teaching Python, the question was: how do we get it installed?
+
+00:28:29.980 --> 00:28:32.940
+At first, it surprised me how difficult it was for people.
+
+00:28:33.100 --> 00:28:34.620
+Because like, oh, come on, we just got Python.
+
+00:28:34.800 --> 00:28:35.740
+Like, what's so hard about this?
+
+00:28:35.920 --> 00:28:40.400
+But it turns out it's a really big barrier to entry for newcomers.
+
+00:28:40.880 --> 00:28:44.740
+And I'm very happy that Jupyter Lite now has solved its problems with input.
+
+00:28:45.140 --> 00:28:46.380
+And it's like huge.
+
+00:28:47.020 --> 00:28:53.080
+But until now, I hadn't really thought about starting with uv because it's cross-platform.
+
+00:28:53.470 --> 00:29:02.180
+And if I say to people in the first 10 minutes of class, install uv for your platform, and then say uv init your project, bam, you're done.
+
+00:29:02.330 --> 00:29:03.220
+It just works.
+
+00:29:03.490 --> 00:29:04.280
+And then it works cross-platform.
+
+00:29:04.660 --> 00:29:07.180
+This is mind-blowing, and I'm going to try this at some point.
+
+00:29:07.290 --> 00:29:07.640
+Thank you.
+
+00:29:08.900 --> 00:29:16.460
+I can comment on the mind-blowing part because now when I teach undergraduate students, we start with uv in the very first class, and it is awesome.
+
+00:29:17.000 --> 00:29:31.960
+There were things that would take students, even very strong students who've had lots of experience, a week to set everything up on their new laptop and get everything ready and to understand all the key concepts and know where something is in their path.
+
+00:29:32.140 --> 00:29:36.460
+Now we just say: install uv for your operating system
+
+00:29:37.140 --> 00:29:39.500
+and get running on your computer
+
+00:29:39.920 --> 00:29:41.180
+and then, hey, you're ready to go.
+
+00:29:41.590 --> 00:29:43.700
+And I don't have to teach them about Docker containers
+
+00:29:43.950 --> 00:29:45.960
+and I don't have to tell them how to install Python
+
+00:29:46.090 --> 00:29:47.260
+with some package manager.
+
+00:29:47.900 --> 00:29:49.660
+All of those things just work.
+
+00:29:50.120 --> 00:29:51.900
+And I think from a learning perspective,
+
+00:29:52.210 --> 00:29:55.120
+whether you're in a class or whether you're in a company
+
+00:29:55.450 --> 00:29:56.600
+or whether you're teaching yourself,
+
+00:29:57.420 --> 00:29:59.060
+uv is absolutely awesome.
+
+00:30:00.780 --> 00:30:05.600
+I'm actually wondering whether I am the one who is newest to Python here.
+
+00:30:05.880 --> 00:30:08.820
+I taught myself Python in 2011.
+
+00:30:09.740 --> 00:30:12.300
+So that was like the Python 2.7 stage.
+
+00:30:14.040 --> 00:30:15.380
+But it was my first programming language.
+
+00:30:15.600 --> 00:30:19.080
+I was just procrastinating during my PhD and I was like, I should learn to program.
+
+00:30:19.860 --> 00:30:21.820
+So I just taught myself Python.
+
+00:30:22.480 --> 00:30:26.260
+And I can tell you, when you do not come from an engineering background,
+
+00:30:26.720 --> 00:30:27.840
+you're like, what is Python?
+
+00:30:28.280 --> 00:30:29.340
+What is Python doing?
+
+00:30:29.740 --> 00:30:33.320
+Why am I typing Python to execute this hello world?
+ +00:30:33.380 --> 00:30:39.840 +And if you're kind of curious, you get down a rabbit hole before you even get to the point where you're just focusing on learning the basics. + +00:30:40.900 --> 00:30:54.780 +And so it's exactly, I was going to say with Reuven, like whether you thought about it for teaching, because we're now debating for humble data, which is a beginner's data science community that I'm part of, whether we switch to uv. + +00:30:54.860 --> 00:31:01.800 +this was Chuck's idea because it does abstract away all these details. The debate I have is, + +00:31:02.040 --> 00:31:06.400 +is it too magic? This is kind of the problem because I also remember learning about things + +00:31:06.520 --> 00:31:10.260 +like virtual environments because again, this was my first programming language and being like, + +00:31:10.620 --> 00:31:15.980 +oh, it's a very good idea. This is best practices. And it's also a debate we have in PyCharm, right? + +00:31:16.280 --> 00:31:24.340 +Like how much do you magic away the fundamentals versus making people think a little bit, but + +00:31:24.240 --> 00:31:24.980 +I'm not sure. + +00:31:25.600 --> 00:31:28.540 +Would you even let somebody run without a virtual environment? + +00:31:28.740 --> 00:31:31.020 +That's a stance you could take. + +00:31:31.220 --> 00:31:36.060 +I used to when I first learned Python because it was too complicated. + +00:31:36.520 --> 00:31:38.260 +But then I learned better. + +00:31:38.940 --> 00:31:39.460 +But yes. + +00:31:40.160 --> 00:31:52.480 +I think the consideration here is like hiding the magic isn't like hiding the details and having all this magic just work is great as long as it works. + +00:31:53.320 --> 00:32:01.360 +And the question is, how is it going to break down and how are people going to know how to deal when it breaks down if you hide all the magic? 
+
+00:32:02.160 --> 00:32:16.380
+And I think virtualenvs were, or let's say before we had virtualenvs, installing packages was very much a "you had to know all the details" situation, because it was very likely going to break down in some way, right, before we had virtualenvs.
+
+00:32:16.940 --> 00:32:22.440
+Because you would end up with weird conflicts or multiple copies of a package installed in different parts of the system.
+
+00:32:23.420 --> 00:32:28.180
+When we got virtualenvs, we sort of didn't have to worry about that anymore
+
+00:32:28.360 --> 00:32:32.440
+because we were trained that you can just blow away the virtualenv and it just works.
+
+00:32:33.260 --> 00:32:36.820
+And with uv, we're back into, this looks like a single installation.
+
+00:32:37.180 --> 00:32:42.320
+We don't know what's going to go on, but we've learned, we as a community and also the people
+
+00:32:42.520 --> 00:32:48.460
+working on uv, we have learned from those earlier mistakes, or maybe not mistakes,
+
+00:32:48.560 --> 00:32:55.320
+but consequences of the design, and they have created something that appears to be
+
+00:32:55.520 --> 00:33:01.480
+very stable, where it's unlikely the magic will break, and when the magic does break, it's obvious
+
+00:33:01.760 --> 00:33:08.520
+what the problem is, or it automatically fixes itself. So it's not reusing broken
+
+00:33:08.840 --> 00:33:15.400
+installations and that kind of thing. So the risk now, as it turns out, I think, as is proven by the
+
+00:33:15.360 --> 00:33:22.940
+community adopting uv so fast and so willingly, I think it's acceptable. I think it's, yeah,
+
+00:33:22.950 --> 00:33:28.680
+I think it's proven itself.
It's clear that this is, it's worth the potential of discovering + +00:33:29.740 --> 00:33:38.140 +weird edge cases later, both because it's probably low likelihood, but also the people behind uv, + +00:33:38.440 --> 00:33:43.019 +Astral, have proven that they would jump in and fix those issues, right? They would do anything + +00:33:43.060 --> 00:33:47.140 +they need to keep uv workable the same way. + +00:33:47.690 --> 00:33:51.260 +And they have a focus that Python as a whole cannot have + +00:33:51.560 --> 00:33:53.720 +because they cater to fewer use cases + +00:33:54.020 --> 00:33:55.900 +than Python as a whole needs to. + +00:33:56.600 --> 00:33:56.720 +Yeah. + +00:33:56.910 --> 00:34:00.380 +And on the audience, Galano says, + +00:34:00.580 --> 00:34:02.580 +as an enterprise tech director and Python coder, + +00:34:02.580 --> 00:34:04.040 +I believe we should hide the magic + +00:34:04.170 --> 00:34:06.980 +which empowers the regular employee to do simple things + +00:34:06.990 --> 00:34:08.000 +that make their job easier. + +00:34:08.340 --> 00:34:08.419 +Yeah. + +00:34:09.460 --> 00:34:09.679 +Yeah. + +00:34:10.940 --> 00:34:11.780 +I'll go ahead, Barry. + +00:34:12.600 --> 00:34:21.040 +No, I think this notion of abstractions, right, has always been there in computer science. + +00:34:22.250 --> 00:34:33.500 +And, you know, we've used tools or languages or systems where we've tried to bring that abstraction layer up so that we don't have to think about all these details, as I mentioned before. + +00:34:34.840 --> 00:34:38.100 +The question is, that's always the happy path. + +00:34:38.280 --> 00:34:47.000 +And when I'm trying to teach somebody something like, here's how to use this library or here's how to use this tool, I try to be very opinionated to keep people on that happy path. + +00:34:47.120 --> 00:34:50.520 +Like, assume everything's going to work just right. 
+ +00:34:51.220 --> 00:34:55.520 +Here's how you just make you go down that path to get the thing done that you want. + +00:34:55.520 --> 00:35:04.980 +The question really is, when things go wrong, how narrow is that abstraction? + +00:35:05.300 --> 00:35:07.680 +And are you able, and even when you're just curious, + +00:35:08.200 --> 00:35:10.640 +like what's really going on underneath the hood? + +00:35:11.100 --> 00:35:12.640 +Of course, that's not a really good analogy today + +00:35:12.820 --> 00:35:15.480 +because cars are basically computers on wheels + +00:35:15.700 --> 00:35:17.480 +that you can't really understand how they work. + +00:35:19.760 --> 00:35:20.860 +But back in your day. + +00:35:21.560 --> 00:35:25.060 +But back in my day, we were changing spark plugs, you know. + +00:35:26.220 --> 00:35:27.740 +And crank that window down. + +00:35:28.660 --> 00:35:29.140 +Exactly. + +00:35:30.600 --> 00:35:33.900 +So I think we always have to leave that room + +00:35:33.920 --> 00:35:36.840 +for the curious and the bad path + +00:35:37.380 --> 00:35:38.900 +where when things go wrong + +00:35:39.140 --> 00:35:40.120 +or when you're just like, + +00:35:40.420 --> 00:35:41.980 +you know what, I understand how this works, + +00:35:42.010 --> 00:35:45.380 +but I'm kind of curious about what's really going on. + +00:35:45.580 --> 00:35:47.780 +How easy is it for me to dive in + +00:35:47.960 --> 00:35:50.220 +and get a little bit more of that background, + +00:35:50.610 --> 00:35:53.260 +you know, a little bit more of that understanding + +00:35:53.440 --> 00:35:54.160 +of what's going on. + +00:35:55.140 --> 00:35:59.200 +Yeah, I want the magic to decompose, right? + +00:35:59.400 --> 00:36:01.339 +Like you should be able to explain the magic path + +00:36:01.360 --> 00:36:03.940 +via more decomposed steps using the tool + +00:36:04.090 --> 00:36:05.700 +all the way down to what the tools actually do + +00:36:05.940 --> 00:36:06.400 +behind the scenes. 
+ +00:36:08.860 --> 00:36:11.640 +Just to admit, the reason I brought this up + +00:36:11.640 --> 00:36:12.980 +and I've been thinking about this a lot + +00:36:13.120 --> 00:36:15.120 +is I'm thinking of trying to get the Python launcher + +00:36:15.840 --> 00:36:17.240 +to do a bit more. + +00:36:17.580 --> 00:36:20.240 +Because one interesting thing we haven't really brought up here + +00:36:20.250 --> 00:36:21.800 +is we're all seeing uv, uv, uv. + +00:36:22.620 --> 00:36:23.400 +UV is a company. + +00:36:24.440 --> 00:36:25.480 +There's always the risk they might disappear. + +00:36:26.280 --> 00:36:28.600 +And we haven't de-risked ourselves from that. + +00:36:28.740 --> 00:36:29.480 +Now, we do have Hatch. + +00:36:29.490 --> 00:36:30.180 +We do have PDM. + +00:36:31.200 --> 00:36:34.120 +But as I said, there's kind of a baseline I think they all share + +00:36:34.170 --> 00:36:37.460 +that I think they would probably be okay if the Python launcher just did + +00:36:37.550 --> 00:36:38.940 +because that's based on standards, right? + +00:36:39.060 --> 00:36:40.960 +Because that's the other thing that there's been a lot of work + +00:36:40.970 --> 00:36:42.320 +that has led to this step, right? + +00:36:42.480 --> 00:36:44.860 +Like we've gotten way more packaging standards. + +00:36:45.140 --> 00:36:47.820 +We've got PEP 723 like we mentioned. + +00:36:48.380 --> 00:36:50.720 +There's a lot of work that's come up to lead to this point + +00:36:51.000 --> 00:36:54.780 +that all these tools can lean on to have them all have an equivalent outcome + +00:36:55.060 --> 00:36:57.320 +because it's expected as how they should be. + +00:36:58.519 --> 00:37:01.160 +And so I think it's something we need to consider + +00:37:01.180 --> 00:37:03.160 +of how do we make sure, + +00:37:03.660 --> 00:37:04.160 +like, by the way, + +00:37:04.840 --> 00:37:05.880 +I know the people, + +00:37:06.080 --> 00:37:06.360 +they're great. 
+
+00:37:06.380 --> 00:37:07.780
+I'm not trying to disparage them
+
+00:37:07.960 --> 00:37:08.800
+or say they're going to go away,
+
+00:37:09.000 --> 00:37:09.800
+but it is something
+
+00:37:09.860 --> 00:37:10.520
+we have to consider.
+
+00:37:11.340 --> 00:37:12.700
+And I will also say, Jody,
+
+00:37:14.120 --> 00:37:15.460
+I do think about this for teaching
+
+00:37:15.760 --> 00:37:17.160
+because I'm a dad now
+
+00:37:17.820 --> 00:37:19.320
+and I don't want my kid coming home
+
+00:37:20.120 --> 00:37:21.060
+when they get old enough
+
+00:37:21.160 --> 00:37:22.040
+to learn Python and go,
+
+00:37:22.160 --> 00:37:22.660
+hey, dad,
+
+00:37:22.820 --> 00:37:24.020
+why is getting Python code
+
+00:37:24.140 --> 00:37:24.920
+running so hard?
+
+00:37:25.280 --> 00:37:26.340
+So I want to make sure
+
+00:37:26.520 --> 00:37:27.380
+that that never happens.
+
+00:37:28.300 --> 00:37:29.660
+So it is for me.
+
+00:37:29.880 --> 00:37:30.620
+From the start.
+
+00:37:31.000 --> 00:37:42.940
+Yeah. I realized something: for the 2026 year interview, I have to bring a sign that says "time for next topic" because we've got a bunch of topics and we're running low on time. So Thomas, let's jump over to yours.
+
+00:37:43.680 --> 00:37:50.560
+Oh, and I have two topics as well. So I'm only going to have to pick my favorite child, right? That's terrible.
+
+00:37:52.620 --> 00:37:59.480
+Yeah. So my second favorite child is Lazy Imports, which is a relatively new development. So we'll probably not get to that.
+
+00:38:00.090 --> 00:38:00.220
+And just accepted.
+
+00:38:00.580 --> 00:38:03.220
+And yes, it's been accepted and it's going to be awesome.
+
+00:38:03.960 --> 00:38:07.840
+So I'll just give that a shout out and then move to my favorite child, which is free-threaded Python.
+
+00:38:09.540 --> 00:38:14.280
+For those who were not aware, the global interpreter lock is going away.
+
+00:38:14.770 --> 00:38:16.840
+I am stating it as a fact.
+
+00:38:17.020 --> 00:38:22.600
+It's not actually a fact yet, but that's because the steering council hasn't realized the fact yet.
+
+00:38:24.120 --> 00:38:25.280
+It is trending towards.
+
+00:38:26.160 --> 00:38:26.280
+Yeah.
+
+00:38:26.660 --> 00:38:35.100
+Well, so I was originally on the steering council that accepted the proposal to add free threading as an experimental feature.
+
+00:38:35.250 --> 00:38:41.960
+We had this idea of adding it as experimental and then making it supported, but not the default, and then making it the default.
+
+00:38:42.720 --> 00:38:45.260
+And it was all a little vague and up in the air.
+
+00:38:46.360 --> 00:38:51.900
+And then I didn't get reelected for the steering council last year, which I was not sad about at all.
+
+00:38:52.100 --> 00:38:55.040
+I sort of ran on a, well, if there's nobody better, I'll do it,
+
+00:38:55.160 --> 00:38:57.200
+but otherwise I have other things to do.
+
+00:38:57.820 --> 00:39:01.460
+And it turns out those other things were making sure that free-threaded Python
+
+00:39:01.720 --> 00:39:03.060
+landed in a supported state.
+
+00:39:03.160 --> 00:39:07.600
+So I lobbied the steering council quite hard, as Barry might remember, at the
+
+00:39:07.760 --> 00:39:10.840
+start of the year to get some movement on this,
+
+00:39:10.940 --> 00:39:12.220
+like get some decision going.
+
+00:39:12.940 --> 00:39:15.620
+So for Python 3.14, it is officially supported.
+
+00:39:16.100 --> 00:39:17.320
+The performance is great.
+
+00:39:17.580 --> 00:39:21.740
+It's like between a couple of percent slower and 10% slower,
+
+00:39:22.020 --> 00:39:24.600
+depending on the hardware and the compiler that you use.
+
+00:39:25.180 --> 00:39:35.640
+It's basically the same speed on macOS, which is really, that's a combination of the ARM hardware and Clang specializing things.
+
+00:39:35.820 --> 00:39:37.160
+But it's basically the same speed.
+
+00:39:38.020 --> 00:39:38.240
+Wow.
+
+00:39:39.220 --> 00:39:43.060
+And then on recent GCCs on Linux, it's like a couple of percent slower.
+
+00:39:44.220 --> 00:40:01.860
+So the main problem is really community adoption, getting third-party packages to update their extension modules for the new APIs and the things that by necessity sort of broke, and also supporting free threading in a good way in packages.
+
+00:40:02.510 --> 00:40:09.040
+For Python code, it turns out there are very few changes that need to be made for things to work well under free threading.
+
+00:40:09.520 --> 00:40:21.620
+They might not be entirely thread safe, but usually, like almost always, only in cases where it wasn't thread safe before either, because the GIL doesn't actually affect thread safety, just the likelihood of things breaking.
+
+00:40:22.560 --> 00:40:32.920
+I do think there's been a bit of a, the mindset of the Python community hasn't really been focused on creating thread safe code because the GIL is supposed to protect us.
+
+00:40:32.990 --> 00:40:36.700
+But as soon as it takes multiple steps, then all of a sudden it's just less likely.
+
+00:40:36.920 --> 00:40:38.080
+It's not that it couldn't happen.
+
+00:40:39.060 --> 00:40:39.900
+Yeah, that's my point, right?
+
+00:40:40.100 --> 00:40:42.420
+It's not, the GIL never gave you thread safety.
+
+00:40:42.550 --> 00:40:46.000
+The GIL gave CPython's internals thread safety.
+
+00:40:46.340 --> 00:40:53.100
+It never really affected Python code, and it very rarely affected thread safety in extension modules as well.
+
+00:40:53.230 --> 00:41:01.060
+So they already had to take care of making sure that the global interpreter lock couldn't be released by something that they ended up calling indirectly.
+
+00:41:02.440 --> 00:41:06.420
+So it's actually not that hard to port most things to support free threading.
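The thread-safety point above can be seen in plain Python: something like `counter += 1` is a read-modify-write across several interpreter steps, so the GIL never made it atomic; an explicit lock is what actually provides the guarantee, with or without free threading. A minimal sketch:

```python
import threading

counter = 0
lock = threading.Lock()

def safe_increment(n: int) -> None:
    # counter += 1 spans several interpreter steps (load, add, store),
    # so the GIL alone never made it atomic; the lock does.
    global counter
    for _ in range(n):
        with lock:
            counter += 1

threads = [threading.Thread(target=safe_increment, args=(100_000,)) for _ in range(4)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(counter)  # 400000 with the lock; without it, updates can be lost
```

Dropping the `with lock:` turns this into exactly the kind of code that was already racy under the GIL, just less likely to misbehave.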
+
+00:41:07.640 --> 00:41:12.960
+And the benefits, we've seen some experimental work because, you know, it's still new.
+
+00:41:13.200 --> 00:41:15.800
+There's still a lot of things that don't quite support it.
+
+00:41:15.940 --> 00:41:19.860
+There's still places where thread contention slows things down a lot.
+
+00:41:20.300 --> 00:41:29.560
+But we've seen a lot of examples of really promising, very parallel problems that now speed up by 10x or more.
+
+00:41:30.360 --> 00:41:32.720
+And it's going to be really exciting in the future.
+
+00:41:32.940 --> 00:41:35.360
+And it's in 2025 that this all started.
+
+00:41:35.700 --> 00:41:37.260
+I mean, Sam started it earlier.
+
+00:41:37.360 --> 00:41:42.320
+But, you know, he's been working on this for years, but it landed in 2025.
+
+00:41:42.900 --> 00:41:45.940
+It dropped its experimental stage in 3.14, basically.
+
+00:41:46.560 --> 00:41:46.940
+Yes.
+
+00:41:48.180 --> 00:41:49.320
+I totally agree.
+
+00:41:50.760 --> 00:41:51.060
+Go ahead.
+
+00:41:51.880 --> 00:41:57.720
+I was going to say, were we all, the three of us, on the steering council at the same time when we decided to start the experiment for free threading?
+
+00:41:58.080 --> 00:41:59.460
+I think Barry wasn't on it.
+
+00:42:00.120 --> 00:42:04.160
+Yeah, I missed a couple of years there, but I'm not sure.
+
+00:42:05.580 --> 00:42:08.120
+But, you know, I totally agree.
+
+00:42:08.120 --> 00:42:11.080
+I think free threading is one of the most transformative
+
+00:42:12.120 --> 00:42:15.220
+developments for Python, certainly since Python 3,
+
+00:42:15.460 --> 00:42:18.260
+but even maybe more impactful because of the size
+
+00:42:18.270 --> 00:42:19.660
+of the community today.
+
+00:42:22.559 --> 00:42:25.940
+Personally, you know, not necessarily speaking
+
+00:42:26.120 --> 00:42:30.400
+as a current or potentially former steering council member,
+
+00:42:30.540 --> 00:42:34.700
+we'll see how that shakes out, but I think it's inevitable.
+
+00:42:35.760 --> 00:42:43.600
+I think free threading is absolutely the future of Python, and I think it's going to unlock incredible potential and performance.
+
+00:42:45.420 --> 00:42:46.860
+I think we just have to do it right.
+
+00:42:47.040 --> 00:42:52.240
+And so I talked to lots of teams who are building various software all over the community.
+
+00:42:54.460 --> 00:43:03.300
+And I actually think it's more of an educational and maybe an outreach problem than it is a technological problem.
+
+00:43:03.440 --> 00:43:08.520
+I mean, yes, there are probably APIs that are missing that will make people's lives easier.
+
+00:43:10.020 --> 00:43:16.220
+There's probably some libraries that will make other code a little easier to write or whatever or to understand.
+
+00:43:17.000 --> 00:43:18.180
+But like all that's solvable.
+
+00:43:18.320 --> 00:43:27.220
+And I think really reaching out to the teams that are, you know, like Thomas said, that are building the ecosystem, that are moving the ecosystem to a free threading world.
+
+00:43:28.100 --> 00:43:31.260
+That's where we really need to spend our effort.
+
+00:43:31.380 --> 00:43:32.320
+And we'll get there.
+
+00:43:32.420 --> 00:43:33.380
+It won't be that long.
+
+00:43:33.550 --> 00:43:37.140
+It certainly won't be as long as it took us to get to Python 3.
+
+00:43:42.240 --> 00:43:50.360
+I'm sort of curious, as someone who's not super experienced with threading or basic concurrency,
+
+00:43:50.760 --> 00:43:56.760
+I mean, I've used it, but I feel like now we have threads, especially with free threading,
+
+00:43:57.300 --> 00:44:01.160
+and subinterpreters, and multiprocessing, and asyncio.
+ +00:44:01.920 --> 00:44:03.460 +and I feel like for many people + +00:44:03.720 --> 00:44:05.140 +now it's like oh my god + +00:44:05.780 --> 00:44:07.120 +which one am I supposed to use + +00:44:07.760 --> 00:44:08.860 +and for someone who's + +00:44:09.800 --> 00:44:11.320 +experienced you can sort of say well + +00:44:11.600 --> 00:44:12.860 +this seems like a better choice but + +00:44:13.620 --> 00:44:15.580 +are there any plans to sort of try + +00:44:15.800 --> 00:44:17.860 +to have a taxonomy + +00:44:18.020 --> 00:44:19.660 +of what problems are solved + +00:44:19.760 --> 00:44:20.360 +by which of these + +00:44:22.140 --> 00:44:22.540 +I think + +00:44:25.260 --> 00:44:27.700 +so the premise here + +00:44:27.820 --> 00:44:29.600 +is that everyone would be using + +00:44:30.120 --> 00:44:32.960 +one or more of these low-level techniques that you mentioned. + +00:44:33.860 --> 00:44:36.800 +And I think that's not a good way of looking at it. + +00:44:37.320 --> 00:44:39.960 +AsyncIO is a library that you want to use + +00:44:41.460 --> 00:44:43.560 +for the things that AsyncIO is good at. + +00:44:44.480 --> 00:44:46.960 +And you can actually very nicely combine it + +00:44:47.200 --> 00:44:49.620 +with multiprocessing, with subprocesses, + +00:44:51.700 --> 00:44:55.520 +so that subprocesses and subinterpreters, + +00:44:55.660 --> 00:44:58.080 +just to make it clear that those are two very separate things, + +00:44:58.280 --> 00:45:01.460 +and multi-threading, both with and without free threading. + +00:45:02.100 --> 00:45:06.660 +And it solves different problems or it gives you different abilities + +00:45:06.910 --> 00:45:08.720 +within the AsyncIO framework. + +00:45:08.790 --> 00:45:11.140 +And the same is true for like GUI frameworks. + +00:45:12.870 --> 00:45:15.880 +I mean, GUI frameworks usually want threads for multiple reasons, + +00:45:16.200 --> 00:45:18.480 +but you can use these other things as well. 
+

00:45:18.880 --> 00:45:22.960
+I don't think it's down to teaching end users when to use

+

00:45:22.960 --> 00:45:24.780
+or avoid all these different things.

+

00:45:25.420 --> 00:45:30.360
+I think we need higher level abstractions for tasks that people want to solve.

+

00:45:31.110 --> 00:45:36.800
+And then those can decide on what for their particular use case is a better approach.

+

00:45:38.420 --> 00:45:41.000
+For instance, PyTorch has multiple.

+

00:45:41.340 --> 00:45:53.220
+So it's used, for people who don't know, to train, not just train, but it's used in AI for generating large matrices and LLMs and what have you.

+

00:45:54.600 --> 00:45:58.880
+Part of it is loading data and processing it.

+

00:46:00.100 --> 00:46:04.220
+And the basic ideas of AsyncIO are,

+

00:46:04.420 --> 00:46:06.280
+oh, you can do all these things in parallel

+

00:46:06.600 --> 00:46:08.080
+because you're not waiting on the CPU,

+

00:46:08.400 --> 00:46:09.400
+you're just waiting on I/O.

+

00:46:10.080 --> 00:46:13.280
+Turns out it is still a good idea to use threads

+

00:46:13.500 --> 00:46:15.020
+for massively parallel I/O,

+

00:46:15.160 --> 00:46:18.360
+because otherwise you end up waiting longer than you need to.

+

00:46:18.640 --> 00:46:22.880
+So a problem where we thought AsyncIO would be the solution

+

00:46:22.960 --> 00:46:24.420
+and we never needed threads

+

00:46:24.460 --> 00:46:28.520
+is actually much improved if we tie in threads as well.

+

00:46:29.220 --> 00:46:33.280
+And we've seen massive, massive improvements in data loaders.

+

00:46:33.420 --> 00:46:38.880
+There's even an article, a published article from some people at Meta,

+

00:46:39.400 --> 00:46:44.660
+showing how much they improved the PyTorch data loader by using multiple threads.

+

00:46:45.240 --> 00:46:49.240
+But at a very low level, we don't want end users to need to make that choice, right?

+

00:46:49.940 --> 00:46:50.060
+Yeah.
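+

The point about threads still winning for massively parallel I/O can be sketched in a few lines. This is a minimal, hypothetical illustration (the URLs and sleep times are made up stand-ins for real network calls, not anything from PyTorch's data loader): eight simulated 0.2-second requests overlap in a thread pool instead of taking 1.6 seconds serially.

```python
# Hypothetical sketch: overlapping blocking I/O with a thread pool.
import time
from concurrent.futures import ThreadPoolExecutor

def fetch(url: str) -> str:
    """Simulate an I/O-bound call (e.g. a network request) with a sleep."""
    time.sleep(0.2)
    return f"payload from {url}"

urls = [f"https://example.com/{i}" for i in range(8)]

start = time.perf_counter()
with ThreadPoolExecutor(max_workers=8) as pool:
    # The waits overlap, so total time is close to one request, not eight.
    results = list(pool.map(fetch, urls))
elapsed = time.perf_counter() - start

print(len(results), round(elapsed, 1))
```

The same submit-and-gather shape also works with a process pool, which is part of why `concurrent.futures` comes up as the unified interface in this discussion.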
+

00:46:50.380 --> 00:46:50.560
+Yeah.

+

00:46:50.720 --> 00:46:53.280
+I mean, concurrent.futures is a good point, right?

+

00:46:53.340 --> 00:46:57.140
+Like all of these approaches are all supported there, and it's a unified one.

+

00:46:57.880 --> 00:46:59.960
+So if you were to teach this, for instance, you could say use

+

00:47:00.080 --> 00:47:00.340
+concurrent.futures.

+

00:47:01.230 --> 00:47:01.840
+These are all there.

+

00:47:02.359 --> 00:47:03.940
+This is the potential trade-off.

+

00:47:04.030 --> 00:47:05.440
+Like basically use threads.

+

00:47:05.560 --> 00:47:09.000
+It's going to be the fastest unless there's like some module you have

+

00:47:09.060 --> 00:47:11.860
+that's screwing up because of threads, and then use subinterpreters.

+

00:47:13.040 --> 00:47:15.900
+And if for some reason subinterpreters don't work, you should move to the

+

00:47:16.500 --> 00:47:17.960
+process pool.

+

00:47:18.580 --> 00:47:21.920
+But I mean, basically you just kind of go for the fast

+

00:47:21.940 --> 00:47:23.860
+stuff, and if for some reason it doesn't work, use the next

+

00:47:24.020 --> 00:47:25.420
+fastest, and just kind of do it that way.

+

00:47:25.819 --> 00:47:27.700
+After that, then you start at the lower

+

00:47:27.920 --> 00:47:29.660
+level, like, okay, why

+

00:47:29.880 --> 00:47:31.820
+do I want to use subinterpreters instead of threads,

+

00:47:32.619 --> 00:47:33.600
+and those kinds of things.

+

00:47:33.820 --> 00:47:35.100
+But I think that's a different,

+

00:47:36.020 --> 00:47:37.800
+I think we're all saying, a different level

+

00:47:37.960 --> 00:47:39.820
+of abstraction, which is a term we keep

+

00:47:39.920 --> 00:47:40.320
+bringing up today.

+

00:47:41.920 --> 00:47:43.420
+And so I

+

00:47:44.079 --> 00:47:45.820
+think it's a level that a lot of people are not

+

00:47:45.920 --> 00:47:47.840
+going to have to care about. I think the libraries are the
ones that have

+

00:47:47.880 --> 00:47:49.840
+to care about this and who are going to do a lot

+

00:47:49.840 --> 00:47:51.900
+of this for you. I agree, I would...

+

00:47:51.920 --> 00:47:56.020
+Let me throw this out on our way out the door to get to Reuven's topic.

+

00:47:56.490 --> 00:48:00.120
+I would love to see it solidify around async and await,

+

00:48:00.480 --> 00:48:02.060
+and you just await a thing.

+

00:48:02.360 --> 00:48:03.500
+Maybe put a decorator on something.

+

00:48:03.550 --> 00:48:06.000
+Say, I want this to be threaded.

+

00:48:06.170 --> 00:48:07.360
+I want this to be I/O.

+

00:48:09.780 --> 00:48:13.040
+And you just use async and await and don't have to think about it.

+

00:48:13.110 --> 00:48:15.180
+But that's my dream.

+

00:48:15.799 --> 00:48:16.900
+Reuven, what's your dream?

+

00:48:18.619 --> 00:48:19.780
+Well, how long do you have?

+

00:48:20.940 --> 00:48:21.520
+What's your topic?

+

00:48:22.559 --> 00:48:28.220
+So I want to talk about the Python ecosystem and funding.

+

00:48:29.400 --> 00:48:34.320
+And when I talk to people about Python and I talk to them about how it's open source, they're like, oh, right, it's open source.

+

00:48:34.480 --> 00:48:35.720
+That means I can download it for free.

+

00:48:35.890 --> 00:48:39.060
+And from their perspective, that's sort of where it starts and ends.

+

00:48:39.540 --> 00:48:47.480
+And the notion that people work on it, the notion that it needs funding, the notion that there's a Python Software Foundation that supports a lot of these activities,

+

00:48:47.630 --> 00:48:50.680
+the infrastructure, is completely unknown to them.

+

00:48:51.080 --> 00:48:58.140
+Even quite shocking for them to hear. But Python is in many ways, I think, starting to become a

+

00:48:58.270 --> 00:49:03.960
+victim of its own success, that we've been dependent on companies for a number of years

+

00:49:04.050 --> 00:49:11.380
+to support developers and development.
And we've been assuming that the PSF, which gives money out

+

00:49:11.500 --> 00:49:16.360
+to lots of organizations to run conferences and workshops and so forth, can sort of keep scaling

+

00:49:16.360 --> 00:49:22.700
+up and that they will have enough funding. And we've seen a few sort of shocks to that system in

+

00:49:22.700 --> 00:49:27.940
+the last year. Most recently, the PSF announced, well, first it sort of pared down, about a year ago,

+

00:49:28.030 --> 00:49:33.260
+what it would give money for. And then about five months ago,

+

00:49:33.360 --> 00:49:36.060
+six months ago, I think it was in July or August, they said, actually, we're not gonna be able to

+

00:49:36.080 --> 00:49:41.340
+fund anything for about a year now. And then there was the government grant, I think from the NSF,

+

00:49:41.740 --> 00:49:45.760
+that they turned down. And I'm not disputing the reasons for that at all. It basically said,

+

00:49:45.860 --> 00:49:49.060
+well, we'll give you the money if you don't worry about diversity and inclusion. And given that

+

00:49:49.120 --> 00:49:53.720
+that's like a core part of what the PSF is supposed to do, they could not do that without

+

00:49:54.140 --> 00:49:59.420
+shutting the doors, which would be kind of counterproductive. And so I feel like we're

+

00:49:59.800 --> 00:50:06.800
+not yet there, but we're sort of approaching, to use a strong term, a problem, a crisis, in

+

00:50:07.060 --> 00:50:12.060
+funding Python, that the needs of the community keep growing and growing, whether it's workshops,

+

00:50:12.200 --> 00:50:19.460
+whether it's PyPI, whether it's conferences, and companies are getting squeezed.
And the number of

+

00:50:19.520 --> 00:50:24.900
+people, it always shocks me every time there are PSF elections, the incredibly small number of people

+

00:50:25.140 --> 00:50:31.640
+who vote, which means that, let's assume half the people who vote are members, like for the millions and

+

00:50:31.860 --> 00:50:37.599
+millions of people who program Python out there, an infinitesimally small proportion of them actually

+

00:50:37.600 --> 00:50:42.520
+join and help to fund it. So I'm not quite sure what to do with this other than express concern.

+

00:50:43.920 --> 00:50:47.580
+But I feel like we've got to figure out ways to fund Python and the PSF

+

00:50:47.890 --> 00:50:51.420
+in new ways that will allow it to grow and scale as needed.

+

00:50:54.480 --> 00:51:01.940
+Yeah, I couldn't agree more. I mean, obviously, the PSF is close to my heart,

+

00:51:02.900 --> 00:51:09.440
+because I was on the board for, I think, a total of six or nine years or something over, you know,

+

00:51:09.490 --> 00:51:17.140
+the last 25.
I was also, for six months, the interim general manager, because Ewa left

+

00:51:17.280 --> 00:51:23.120
+and we hadn't hired Deb yet, while I was on the board. So I remember signing the sponsor

+

00:51:23.580 --> 00:51:29.599
+contracts for the companies that came in wanting to sponsor Python, and it is like, it's

+

00:51:29.620 --> 00:51:34.940
+ridiculous how, and I can say this working for a company that is one of the biggest sponsors of the

+

00:51:35.040 --> 00:51:42.080
+PSF and has done so for years, it's ridiculous how small those sponsorships are, and yet how grateful

+

00:51:42.320 --> 00:51:49.020
+we were that they came in, because every single one has such a big impact. You can do so much good with

+

00:51:49.120 --> 00:51:57.520
+the money that comes in. We need more corporate sponsorships more than we need, like, I mean, obviously

+

00:51:57.580 --> 00:52:01.960
+a million people giving us a couple of bucks, giving the PSF, let's be clear, I'm not on the

+

00:52:02.080 --> 00:52:09.320
+board anymore, giving the PSF a couple of bucks would be fantastic. But I think the big players,

+

00:52:10.440 --> 00:52:15.860
+the big corporate players, where all the AI money is, for instance, having done basically

+

00:52:16.180 --> 00:52:24.920
+no sponsorship of the PSF is mind-boggling. It is a textbook fallacy, or no, a tragedy of the

+

00:52:24.860 --> 00:52:25.880
+commons right there, right?

+

00:52:26.000 --> 00:52:32.600
+They rely entirely on PyPI, and PyPI is run entirely with community resources, mostly

+

00:52:32.760 --> 00:52:38.220
+because of very, very generous and consistent sponsorship, basically by Fastly,

+

00:52:38.900 --> 00:52:46.400
+but also the other sponsors of the PSF. And yet a lot of very large players use

+

00:52:46.640 --> 00:52:51.520
+those resources more than anyone else and don't actually contribute.

+

00:52:51.640 --> 00:52:54.560
+And I, yes.

+

00:52:55.440 --> 00:52:55.720
+Frustrating.
+

00:52:56.580 --> 00:52:57.640
+I mean, oh.

+

00:52:59.920 --> 00:53:01.040
+It's okay, Jodi, go ahead.

+

00:53:01.820 --> 00:53:04.380
+I was going to say, Georgi Ker,

+

00:53:04.880 --> 00:53:08.160
+she wrote this fantastic blog post saying pretty much this

+

00:53:08.320 --> 00:53:09.440
+straight after EuroPython.

+

00:53:10.400 --> 00:53:13.260
+So EuroPython this year was really big, actually,

+

00:53:13.700 --> 00:53:16.360
+and she was wandering around looking at the sponsor booths,

+

00:53:17.100 --> 00:53:18.500
+and the usual players were there,

+

00:53:18.880 --> 00:53:21.180
+but none of these AI companies were there.

+

00:53:21.940 --> 00:53:26.220
+And the relationship, actually, between AI, if you want to call it that,

+

00:53:26.460 --> 00:53:28.300
+let's call it ML and neural networks,

+

00:53:29.160 --> 00:53:33.080
+and some of the really big companies, and Python, actually is really complex.

+

00:53:33.360 --> 00:53:37.120
+Obviously, a lot of these companies, and some of us are here,

+

00:53:37.580 --> 00:53:39.700
+employ people to work on Python.

+

00:53:40.820 --> 00:53:44.440
+Companies like Meta and Google have contributed massively to frameworks

+

00:53:44.840 --> 00:53:47.280
+like PyTorch, TensorFlow, Keras.

+

00:53:49.540 --> 00:53:53.280
+So it's not as simple a picture as saying cough up money all the time.

+

00:53:53.420 --> 00:53:55.300
+Like there's a more complex picture here,

+

00:53:55.480 --> 00:53:59.180
+but definitely there are some notable absences.

+

00:54:00.220 --> 00:54:03.160
+And we talked about the volume of money going through.

+

00:54:05.320 --> 00:54:06.700
+I totally agree with the sentiment.
+

00:54:06.800 --> 00:54:11.840
+Like when the shortfall came and the grants program had to shut down,

+

00:54:12.640 --> 00:54:16.680
+we were brainstorming at JetBrains, like maybe we can do some sort of,

+

00:54:17.520 --> 00:54:21.900
+I don't know, donate some more money and call on other companies to do it, or we can call on people

+

00:54:21.930 --> 00:54:26.660
+in the community. And I was like, I don't want to call on people in the community to do it,

+

00:54:27.000 --> 00:54:31.340
+because they're probably the same people who are also donating their time for Python. Like,

+

00:54:31.700 --> 00:54:37.040
+it's just squeezing people who give so much of themselves to this community even more.

+

00:54:37.680 --> 00:54:42.380
+And it's not sustainable. Like Reuven said, if we keep doing this, the whole community is going to

+

00:54:42.280 --> 00:54:47.680
+collapse. Like, I'm sure we've all had our own forms of burnout from giving too much.

+

00:54:48.940 --> 00:54:53.280
+Yeah, I mean, I want to say, I'm going to pat ourselves on the back here, everyone on this

+

00:54:53.440 --> 00:54:59.640
+call who works at a company, we're all sponsors of the PSF, thank goodness. But there's obviously a lot

+

00:54:59.640 --> 00:55:03.780
+of people not on this call who are not sponsors, and I know personally I wished every company that

+

00:55:03.920 --> 00:55:11.200
+generated revenue from Python usage donated to the PSF. And I think part of

+

00:55:11.220 --> 00:55:14.740
+the problem is some people think it has to be like a hundred thousand dollars or something. It does not

+

00:55:14.770 --> 00:55:19.800
+have to be a hundred thousand dollars. Now, if you can afford that amount, please do, or more. There are

+

00:55:19.940 --> 00:55:26.360
+many ways to donate more than the maximum amount for getting on the website. But it's one of

+

00:55:26.360 --> 00:55:30.600
+these funny things where a lot of people are just like, oh, it's not me, right? Like even
startups don't, some

+

00:55:30.780 --> 00:55:35.480
+do, to give those ones credit, but others don't, because, like, oh, we're burning through capital,

+

00:55:35.500 --> 00:55:40.620
+blah, blah, blah. I was like, yeah, but we're asking for, like, less than you'd pay a dev,

+

00:55:41.340 --> 00:55:47.300
+right, by a lot, per year. Right? Like the amount we actually ask for to get to the highest tier is

+

00:55:47.420 --> 00:55:52.700
+still less than a common developer in Silicon Valley, if we're going to price point to a

+

00:55:52.940 --> 00:55:58.840
+geographical location we can all kind of comprehend, right? I'm going to steal Ned Batchelder's

+

00:55:58.920 --> 00:56:06.940
+observation here, and, yeah, what the PSF would be happy with is less than a medium-sized company spends

+

00:56:07.200 --> 00:56:15.160
+on the tips of expensed meals every year. Yeah. And it's a long-running problem,

+

00:56:15.320 --> 00:56:21.280
+right? Like, I mean, I've been on the PSF for a long time too. I've not served as many years,

+

00:56:21.580 --> 00:56:25.020
+I was almost on the board, but I was, like, executive vice president, because we had to have

+

00:56:24.860 --> 00:56:31.840
+someone with that title at some point. But it's always been a struggle, right?

+

00:56:32.300 --> 00:56:36.780
+And I also want to be clear, I'm totally appreciative of where we have gotten to, right? Because for the

+

00:56:36.780 --> 00:56:42.160
+longest time I was just dying for paid staff on the core team, and now we have three developers

+

00:56:42.250 --> 00:56:47.760
+in residence, thank goodness. Still not enough, to be clear. I want five, and I've always said that, but

+

00:56:48.160 --> 00:56:52.520
+I'll happily take three. But it's one of these things where it's a constant struggle. And it

+

00:56:52.540 --> 00:56:56.660
+got a little bit better before the pandemic, just because everyone was spending on conferences.
+

00:56:57.040 --> 00:56:59.600
+And PyCon US is a big driver for the Python Software Foundation.

+

00:57:00.300 --> 00:57:03.680
+And I know EuroPython's a driver for the EuroPython Society.

+

00:57:04.780 --> 00:57:07.640
+But then COVID hit and conferences haven't picked back up.

+

00:57:07.790 --> 00:57:16.000
+And then there's a whole new cohort of companies that have come in post-pandemic that have never had that experience of going to PyCon and sponsoring PyCon.

+

00:57:16.070 --> 00:57:21.580
+And so they don't think about, I think, sponsoring PyCon or the PSF, because that's also a big kind of in-your-face,

+

00:57:21.760 --> 00:57:22.800
+you should help sponsor this.

+

00:57:23.550 --> 00:57:27.020
+And I think it's led to this kind of lull where overall spending has gone down.

+

00:57:27.820 --> 00:57:31.480
+New entrants into the community have not had that experience and thought about it.

+

00:57:31.640 --> 00:57:36.380
+And it's led to this kind of dearth where, yeah, the PSF had to stop giving out grant money.

+

00:57:36.740 --> 00:57:37.220
+And it sucks.

+

00:57:37.880 --> 00:57:40.120
+And I would love to see it not be that problem.

+

00:57:41.280 --> 00:57:44.920
+I want to add one interesting data point that I discovered in preparing for today.

+

00:57:47.019 --> 00:57:49.740
+NumFOCUS has about twice the budget of the PSF.

+

00:57:50.900 --> 00:57:52.580
+I was shocked to discover this.

+

00:57:54.040 --> 00:57:57.500
+So basically, it is possible to get money from companies

+

00:57:58.020 --> 00:58:00.920
+to sponsor development of Python-related projects.

+

00:58:01.520 --> 00:58:03.580
+And I don't know what they're doing that we aren't.

+

00:58:04.680 --> 00:58:07.300
+And I think it's worth talking and figuring it out.

+

00:58:09.839 --> 00:58:15.140
+We need a fundraiser and marketer in residence, maybe.

+

00:58:15.400 --> 00:58:15.660
+Who knows?
+

00:58:17.040 --> 00:58:18.780
+Lauren does a great job, to be clear.

+

00:58:19.380 --> 00:58:19.820
+Who does that?

+

00:58:19.880 --> 00:58:22.040
+The PSF has Lauren, and Lauren is that.

+

00:58:22.040 --> 00:58:22.100
+Okay.

+

00:58:22.360 --> 00:58:22.640
+Yes.

+

00:58:22.800 --> 00:58:22.960
+Okay.

+

00:58:24.560 --> 00:58:25.420
+But it's still hard.

+

00:58:25.980 --> 00:58:27.520
+We have someone doing it full-time at the PSF,

+

00:58:27.560 --> 00:58:28.360
+and it's just hard to get.

+

00:58:28.800 --> 00:58:29.340
+It's, yeah.

+

00:58:29.500 --> 00:58:31.260
+It's hard to get people to cough up cash.

+

00:58:31.860 --> 00:58:33.180
+Yeah, and what do we get in return?

+

00:58:34.100 --> 00:58:35.220
+Well, we already get that, so.

+

00:58:35.680 --> 00:58:36.240
+Yeah, I know.

+

00:58:36.980 --> 00:58:37.740
+All right, Barry.

+

00:58:40.120 --> 00:58:43.760
+Yeah, I guess to just, you know,

+

00:58:43.960 --> 00:58:46.600
+shift gears into a different area,

+

00:58:46.940 --> 00:58:52.880
+something that I've been thinking about a lot over this past year on the steering council. Thomas, I'm

+

00:58:53.000 --> 00:59:01.140
+sure, is going to be very well aware, having been instrumental in the lazy imports PEP, 810.

+

00:59:04.680 --> 00:59:14.700
+We have to sort of rethink how we evolve Python and how we propose changes to Python and how we

+

00:59:14.660 --> 00:59:16.680
+discuss those changes in the community.

+

00:59:16.750 --> 00:59:23.500
+Because I think one of the things that I have heard over and over and over again is that

+

00:59:25.100 --> 00:59:32.300
+authoring PEPs is incredibly difficult and emotionally draining.

+

00:59:33.020 --> 00:59:34.460
+And it's a time sink.
+

00:59:36.780 --> 00:59:42.720
+And leading those discussions on discuss.python.org, which we typically call DPO,

+

00:59:43.460 --> 00:59:51.600
+can be toxic at times and very difficult. So one of the things that I realized as I was

+

00:59:51.760 --> 01:00:00.400
+thinking about this is that PEPs are 25 years old now, right? So we've had this, and not only just

+

01:00:00.580 --> 01:00:07.360
+are PEPs old, but, like, we've gone through at least two, if not more, sort of complete revolutions

+

01:00:07.300 --> 01:00:09.140
+in the way we discuss things, you know.

+

01:00:10.260 --> 01:00:11.980
+The community has grown incredibly.

+

01:00:12.860 --> 01:00:16.060
+The developer community is somewhat larger,

+

01:00:17.120 --> 01:00:19.740
+but just the number of people who are using Python

+

01:00:20.080 --> 01:00:24.540
+and who have an interest in it has grown exponentially.

+

01:00:24.960 --> 01:00:30.580
+So it has become really difficult to evolve the language

+

01:00:30.630 --> 01:00:32.440
+and the standard library and the interpreter.

+

01:00:33.120 --> 01:00:42.480
+And we need to sort of think about how we can make this easier for people and not lose the voice of the user.

+

01:00:42.900 --> 01:00:49.580
+And I mean, you know, the number of people who actually engage in topics on DPO is the tip of the iceberg.

+

01:00:49.770 --> 01:01:00.380
+You know, we've got millions and millions of users out there in the world who, for example, lazy imports will affect, free threading will affect, and don't even know that they have a voice.

+

01:01:00.640 --> 01:01:04.020
+And maybe we have to basically represent that,

+

01:01:04.020 --> 01:01:07.740
+but we have to do it in a much more collaborative and positive way.

+

01:01:08.780 --> 01:01:11.040
+So that's something that I've been thinking about a lot.
+

01:01:11.180 --> 01:01:14.240
+And whether or not I'm on the steering council next year,

+

01:01:14.260 --> 01:01:17.160
+I think this is something that I'm going to spend some time on,

+

01:01:17.500 --> 01:01:21.380
+trying to think about and talk to people about ways we can make this easier for everyone.

+

01:01:22.520 --> 01:01:26.100
+The diversity of use cases for Python in the last couple of years.

+

01:01:26.200 --> 01:01:26.780
+So complex.

+

01:01:27.440 --> 01:01:28.060
+Yes, exactly.

+

01:01:29.160 --> 01:01:31.400
+It should also be prefaced that Barry created the PEP process.

+

01:01:31.680 --> 01:01:33.000
+He should have started with that one.

+

01:01:36.180 --> 01:01:37.560
+It is that old.

+

01:01:38.600 --> 01:01:38.740
+Yeah.

+

01:01:40.000 --> 01:01:41.480
+By the way, just so everyone knows,

+

01:01:41.620 --> 01:01:43.620
+these are not age jokes to be mean to Barry.

+

01:01:44.100 --> 01:01:44.640
+No, no.

+

01:01:44.840 --> 01:01:46.120
+All of us have known Barry long enough

+

01:01:46.160 --> 01:01:48.560
+that we know Barry's okay with us making these jokes.

+

01:01:48.600 --> 01:01:49.340
+To be very, very...

+

01:01:49.360 --> 01:01:51.560
+Also, I am almost as old as Barry,

+

01:01:51.780 --> 01:01:53.360
+although I don't look as old as Barry.

+

01:01:54.300 --> 01:01:55.800
+Yeah, we're all about the same age anyway.

+

01:01:57.920 --> 01:02:04.640
+Yeah, Barry and I have known each other for 25 years, and I've always made these jokes about him.

+

01:02:04.820 --> 01:02:10.240
+So it is different when you know each other in person, let's put it that way.

+

01:02:12.760 --> 01:02:20.060
+I think for the PEP process, I think for a lot of people, it's not obvious how difficult the process is.

+

01:02:20.300 --> 01:02:21.760
+I mean, it wasn't even obvious to me.
+

01:02:23.520 --> 01:02:29.060
+I saw people avoiding writing PEPs multiple times, and I was upset, like on the steering

+

01:02:29.250 --> 01:02:29.700
+council, right?

+

01:02:30.360 --> 01:02:34.460
+I saw people making changes where I thought, this is definitely something that should have

+

01:02:34.460 --> 01:02:38.660
+been discussed in a PEP, and the discussion should be recorded in a PEP and all that.

+

01:02:39.360 --> 01:02:44.680
+And I didn't understand why they didn't, until basically until PEP 810.

+

01:02:44.750 --> 01:02:52.920
+So I did PEP 779, which was the one giving free threading supported status at

+

01:02:52.940 --> 01:02:59.220
+the start of the year. And the discussion there was, you know, sort of as expected, and it

+

01:02:59.300 --> 01:03:03.100
+was already an accepted PEP. It was just the question of how does it become supported?

+

01:03:03.780 --> 01:03:10.860
+That one wasn't too exhausting. And then we got to lazy imports, which was Pablo, who is

+

01:03:11.760 --> 01:03:17.240
+another steering council member, as well as a bunch of other contributors, including me and two of my

+

01:03:17.240 --> 01:03:22.200
+co-workers and one of my former co-workers, who had all had a lot of experience with lazy imports,

+

01:03:22.280 --> 01:03:25.300
+but not necessarily as much experience with the PEP process.

+

01:03:26.280 --> 01:03:30.940
+And Pablo took the front seat because he knew the PEP process, and he's done

+

01:03:31.040 --> 01:03:34.740
+like five PEPs in the last year or something, some ridiculous number.

+

01:03:36.420 --> 01:03:43.600
+And he shared with us the vitriol he got, like, offline, for just the

+

01:03:43.980 --> 01:03:46.820
+audacity of proposing something that people disagreed with or something.

+

01:03:47.580 --> 01:03:51.860
+And that was like, this is a technical suggestion.
+

01:03:52.140 --> 01:03:58.520
+This is not a code of conduct issue, where I have received my fair share of vitriol around.

+

01:03:59.400 --> 01:04:05.920
+This is a technical discussion, and yet he gets these ridiculous accusations in his mailbox.

+

01:04:06.460 --> 01:04:12.360
+And for some reason, only the primary author gets it as well, which is just weird to me.

+

01:04:13.680 --> 01:04:15.280
+I'll point out something.

+

01:04:15.570 --> 01:04:18.300
+People are lazy, Thomas, is what I think you just said.

+

01:04:19.240 --> 01:04:28.660
+Remember, the steering council exists because Guido got the brunt of this for PEP 572, which was the walrus operator, right?

+

01:04:29.070 --> 01:04:34.620
+Which is just like this minor little syntactic thing that is kind of cool when you need it.

+

01:04:34.930 --> 01:04:48.120
+But like just the amount of anger and negative energy and vitriol that he got over that was enough for him to just say, I'm out, you know, and you guys figure it out.

+

01:04:48.220 --> 01:04:55.700
+And that cannot be an acceptable way to discuss the evolution of the language.

+

01:04:56.220 --> 01:05:03.320
+Especially since apparently now every single PEP author of any contentious or semi-contentious PEP gets this.

+

01:05:03.700 --> 01:05:07.100
+Although I have to say, PEP 810 had such broad support,

+

01:05:07.260 --> 01:05:09.200
+it was hard to call it contentious.

+

01:05:09.360 --> 01:05:12.660
+It's just, there's a couple of very loud opinions, I guess.

+

01:05:13.040 --> 01:05:15.540
+And I'm not saying we shouldn't listen to people.

+

01:05:15.740 --> 01:05:21.600
+We should definitely listen to, especially, contrary opinions, but there has to be a limit.

+

01:05:22.210 --> 01:05:25.360
+There has to be an acceptable way of bringing things up.

+

01:05:25.370 --> 01:05:30.380
+There has to be an acceptable way of saying, hey, you didn't actually read the PEP.
+

01:05:30.730 --> 01:05:36.360
+Please go back and reconsider everything you said after you fully digested the things,

+

01:05:36.840 --> 01:05:38.680
+because everything's already been addressed in the PEP.

+

01:05:38.900 --> 01:05:47.520
+It's just really hard to do this in a way that doesn't destroy the relationship with the person you're telling this, right?

+

01:05:48.280 --> 01:05:54.440
+It's hard to tell people, hey, I'm not going to listen to you because you've done a bad job.

+

01:05:55.320 --> 01:05:57.000
+You've chosen not to inform yourself.

+

01:05:58.040 --> 01:06:08.100
+And I think you make another really strong point, Thomas, which is that there have been changes that have been made to Python that really should have been a PEP.

+

01:06:08.660 --> 01:06:11.940
+And they aren't because people don't want to go through,

+

01:06:12.080 --> 01:06:14.900
+core developers don't want to go through this gauntlet,

+

01:06:15.160 --> 01:06:17.160
+and so they'll create a PR, and then that,

+

01:06:17.210 --> 01:06:19.840
+but that's also not good because then, you know,

+

01:06:19.920 --> 01:06:24.200
+we don't have the right level of consideration.

+

01:06:25.240 --> 01:06:27.260
+And you think about the way that, you know,

+

01:06:27.400 --> 01:06:29.460
+if you're in your job and you're making a change

+

01:06:29.600 --> 01:06:30.740
+to something in your job,

+

01:06:30.830 --> 01:06:34.060
+you have a very close relationship to your teammates,

+

01:06:34.350 --> 01:06:36.359
+and so you have that kind of respect,

+

01:06:36.380 --> 01:06:38.260
+and hopefully, right?

+

01:06:38.580 --> 01:06:40.400
+Like compassion and consideration.

+

01:06:41.060 --> 01:06:44.360
+And you can have a very productive discussion

+

01:06:44.620 --> 01:06:46.600
+about a thing, and you may win some arguments

+

01:06:46.800 --> 01:06:47.840
+and you may lose some arguments,

+

01:06:47.930 --> 01:06:50.140
+but the team moves forward as one.
+

01:06:50.400 --> 01:06:53.060
+And I think we've lost a bit of that in Python.

+

01:06:53.460 --> 01:06:55.760
+So yeah, that's not great.

+

01:06:56.070 --> 01:06:58.160
+I think society in general

+

01:06:58.230 --> 01:07:00.000
+could use a little more civility and kindness,

+

01:07:00.180 --> 01:07:02.519
+especially to strangers that they haven't met,

+

01:07:02.540 --> 01:07:06.840
+in forums, social media, driving, you name it.

+

01:07:07.380 --> 01:07:11.180
+Okay, but we're not going to solve that here, I'm sure.

+

01:07:11.460 --> 01:07:13.940
+So instead, let's do Gregory's topic.

+

01:07:14.720 --> 01:07:16.780
+Hey, I'm going to change topics quite a bit,

+

01:07:16.980 --> 01:07:20.420
+but I wanted to call 2025 the year of type checking

+

01:07:20.640 --> 01:07:21.900
+and language server protocols.

+

01:07:22.980 --> 01:07:25.880
+So many of us probably have used tools like mypy

+

01:07:26.100 --> 01:07:28.240
+to check to see if the types line up in our code

+

01:07:28.780 --> 01:07:32.180
+or whether or not we happen to be overriding functions correctly.

+

01:07:32.960 --> 01:07:38.720
+And so I've used mypy for many years and loved the tool and had a great opportunity to chat with the creator of it.

+

01:07:39.300 --> 01:07:42.940
+And I integrate that into my CI, and it's really been wonderful.

+

01:07:43.220 --> 01:07:47.260
+And I've also been using a lot of LSPs, like, for example, Pyright or Pylance.

+

01:07:48.420 --> 01:07:53.360
+But in this year, one of the things that we've seen is, number one, Pyrefly from the team at Meta.

+

01:07:53.940 --> 01:07:56.420
+We've also seen ty from the team at Astral.

+

01:07:56.920 --> 01:08:00.740
+And there's another one called Zuban, and Zuban is from David Halter.

+

01:08:01.240 --> 01:08:07.780
+David was also the person who created Jedi, which is another system in Python that helped with a lot of LSP tasks.
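+

For anyone newer to this, here's the flavor of mistake these tools catch before the code ever runs. The snippet is hypothetical (the `average` function and its values are invented for illustration): a checker such as mypy, run over this file, verifies every call against the annotations and flags the commented-out call for passing strings where floats are declared.

```python
# Hypothetical example: a type checker verifies calls against annotations
# like these statically, without executing anything.
def average(values: list[float]) -> float:
    """Mean of a list of floats."""
    return sum(values) / len(values)

ok = average([1.0, 2.0, 3.0])  # matches the annotation; checkers accept this

# A checker would flag the call below: list[str] where list[float] is
# expected. At runtime it would crash inside sum(), so the checker
# catches it before you ship.
# bad = average(["1.0", "2.0"])

print(ok)  # → 2.0
```

The speed difference discussed here matters because this check runs on every save in an editor or on every CI push, so cutting it from tens of seconds to under a second changes how often you actually run it.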
+

01:08:08.640 --> 01:08:13.900
+What's interesting about all three of the tools that I just mentioned is that they're implemented in Rust,

+

01:08:14.760 --> 01:08:22.200
+and they have taken a lot of the opportunity to make the type checker and/or the LSP significantly faster.

+

01:08:22.960 --> 01:08:29.160
+So for me, this has changed how I use the LSP or the type checker and how frequently I use it.

+

01:08:29.700 --> 01:08:38.759
+And in my experience, it has helped me to take things that might take tens of seconds or hundreds of seconds and cut them down often to less than a second.

+

01:08:39.440 --> 01:08:46.380
+And it's really changed the way in which I'm using a lot of the tools like ty or Pyrefly or Zuban.

+

01:08:46.880 --> 01:08:54.560
+So I have some more details if I'm allowed to share, Michael, but I would say 2025 is the year of type checkers and LSPs.

+

01:08:54.900 --> 01:08:57.460
+I think given the timing, let's have people give some feedback.

+

01:08:57.799 --> 01:09:01.859
+I personally have been using Pyrefly a ton and am a big fan of it.

+

01:09:03.560 --> 01:09:03.680
+Yeah.

+

01:09:06.660 --> 01:09:07.200
+Yeah, I mean.

+

01:09:07.339 --> 01:09:11.400
+I don't know if I'm allowed to have an opinion that isn't "Pyrefly is awesome."

+

01:09:12.640 --> 01:09:17.680
+I mean, I'm not on the Pyrefly team, but I do regularly chat with people from the Pyrefly team.

+

01:09:17.980 --> 01:09:19.920
+Tell people real quick what it is, Thomas.

+

01:09:20.720 --> 01:09:25.540
+So Pyrefly is Meta's attempt at a Rust-based type checker.

+

01:09:26.720 --> 01:09:28.400
+And so it's very similar to ty.

+

01:09:28.859 --> 01:09:31.700
+It started basically at the same time, a little later.

+

01:09:32.720 --> 01:09:35.299
+Meta originally had a type checker called Pyre,

+

01:09:35.779 --> 01:09:37.020
+which was written in OCaml.
+ +01:09:39.460 --> 01:09:42.020 +They basically decided to start a rewrite in Rust, + +01:09:42.279 --> 01:09:44.200 +and then that really took off, + +01:09:44.279 --> 01:09:47.279 +and that's where we're going now. + +01:09:50.380 --> 01:09:51.680 +Yeah, I don't know what I can say, + +01:09:51.819 --> 01:09:54.600 +because I'm actually on the same team as the Pylance team. + +01:09:56.740 --> 01:09:58.340 +So, but no, I mean, I think it's good. + +01:09:58.640 --> 01:10:00.480 +I think this is one of those interesting scenarios + +01:10:01.180 --> 01:10:03.260 +where some people realize like, you know what? + +01:10:03.740 --> 01:10:06.860 +We're going to pay the penalty of writing a tool + +01:10:07.780 --> 01:10:09.620 +in a way that makes it faster, but makes us go slower, + +01:10:09.820 --> 01:10:11.220 +because the overall win for the community + +01:10:11.440 --> 01:10:13.040 +is going to be a good win. + +01:10:13.070 --> 01:10:14.680 +So it's worth that headache, right? + +01:10:15.000 --> 01:10:16.440 +Not that I want to scare people off + +01:10:16.490 --> 01:10:17.900 +from writing Rust, but let's be honest. + +01:10:17.950 --> 01:10:19.320 +It takes more work to write Rust code + +01:10:19.420 --> 01:10:20.620 +than it does to write Python code. + +01:10:22.700 --> 01:10:24.580 +But some people chose to make that trade off + +01:10:24.580 --> 01:10:25.800 +and we're all benefiting from it. 
+ +01:10:27.340 --> 01:10:29.360 +The one thing I will say that's kind of interesting from this + +01:10:29.720 --> 01:10:31.700 +that hasn't gotten a lot of play yet + +01:10:31.700 --> 01:10:32.520 +because it's still being developed, + +01:10:33.420 --> 01:10:36.140 +but Pylance is actually working with the Pyrefly team + +01:10:36.740 --> 01:10:39.640 +to define a type server protocol, TSP, + +01:10:40.420 --> 01:10:42.020 +so that a lot of these type servers + +01:10:42.220 --> 01:10:43.900 +can just kind of feed the type information + +01:10:44.030 --> 01:10:45.040 +to a higher level LSP + +01:10:45.280 --> 01:10:48.400 +and let that LSP handle the stuff like symbol renaming + +01:10:48.510 --> 01:10:49.260 +and all that stuff, right? + +01:10:49.380 --> 01:10:51.880 +Because the key thing here + +01:10:52.980 --> 01:10:54.520 +and the reason there's so many + +01:10:54.540 --> 01:10:58.200 +different type checkers is there is a spec, right? And everyone's trying to implement it, but + +01:10:58.880 --> 01:11:03.520 +there are differences in terms of type inferencing. And if you actually go listen to + +01:11:04.020 --> 01:11:08.600 +Michael's interview from Talk Python To Me with the Pyrefly team, they actually did a nice little + +01:11:08.800 --> 01:11:13.940 +explanation of the difference between Pyright's approach and Pyrefly's approach. And so there's a + +01:11:13.940 --> 01:11:18.960 +bit of variance. But for instance, I think there's some talk now of trying to figure out how to make it + +01:11:19.040 --> 01:11:22.480 +so everyone doesn't have to reimplement how to rename a symbol, right? That's kind of boring; that's not + +01:11:22.500 --> 01:11:23.840 +where the interesting work is. 
+ +01:11:24.100 --> 01:11:26.460 +And that's not performance-critical, from the perspective of + +01:11:26.980 --> 01:11:29.700 +wanting to instantaneously get that squiggly red line, + +01:11:30.080 --> 01:11:33.880 +whether it's in VS Code or it's in PyCharm + +01:11:33.920 --> 01:11:35.220 +or whatever your editor is, right? + +01:11:35.340 --> 01:11:37.480 +You want to get it as fast as possible. + +01:11:37.660 --> 01:11:38.340 +But the rename... + +01:11:38.340 --> 01:11:38.560 +Jupyter. + +01:11:39.780 --> 01:11:40.080 +Jupyter. + +01:11:40.620 --> 01:11:40.720 +What? + +01:11:41.900 --> 01:11:42.860 +No, not Emacs. + +01:11:43.100 --> 01:11:43.340 +What's that? + +01:11:43.740 --> 01:11:44.320 +No, not Emacs. + +01:11:45.980 --> 01:11:47.620 +But, you know, just to... + +01:11:47.960 --> 01:11:48.300 +Oh, sorry. + +01:11:49.020 --> 01:11:50.240 +Just to bring things full circle, + +01:11:51.240 --> 01:11:54.120 +it's that focus on user experience, right? + +01:11:54.420 --> 01:11:56.500 +Which is, yes, you want that squiggly line, + +01:11:56.860 --> 01:11:58.080 +but when things go wrong, + +01:11:58.640 --> 01:11:59.580 +when your type checker says, + +01:11:59.600 --> 01:12:00.620 +oh, you've got a problem, + +01:12:01.920 --> 01:12:04.960 +how, you know, like I think about as an analogy, + +01:12:06.540 --> 01:12:08.760 +how Pablo has done an amazing amount of work + +01:12:10.200 --> 01:12:12.140 +on the error reporting, right? + +01:12:12.220 --> 01:12:14.680 +When you get an exception and, you know, + +01:12:15.020 --> 01:12:17.680 +now you have a lot more clues about + +01:12:18.160 --> 01:12:20.300 +what is it that I actually have to change + +01:12:20.680 --> 01:12:24.200 +to make the tool, you know, to fix the problem, right? 
+ +01:12:24.520 --> 01:12:28.140 +Like so many times years ago, you know, + +01:12:28.140 --> 01:12:29.920 +when people were using mypy, for example, + +01:12:30.360 --> 01:12:35.160 +and they'd have some complex failure of their type annotations + +01:12:35.440 --> 01:12:38.660 +and have absolutely no idea what to do about it. + +01:12:38.840 --> 01:12:42.360 +And so getting to a place where now we're not just telling people + +01:12:42.760 --> 01:12:47.780 +you've done it wrong, but also here's some ideas about how to fix it. + +01:12:48.560 --> 01:12:49.120 +That's the reason. + +01:12:49.600 --> 01:12:51.400 +Rather than, I guess we're going to have a squiggly. + +01:12:51.410 --> 01:12:51.520 +How are you telling AI? + +01:12:53.680 --> 01:12:55.760 +I think this is a full circle here, + +01:12:55.890 --> 01:12:59.060 +because honestly, using typing in your Python code + +01:12:59.320 --> 01:13:01.740 +gives a lot of context to the AI when you ask for help + +01:13:01.910 --> 01:13:02.960 +if you just give it a fragment. + +01:13:05.700 --> 01:13:06.440 +That's true. + +01:13:06.740 --> 01:13:09.580 +And also, if you can teach your AI agent + +01:13:09.740 --> 01:13:12.700 +to use the type checkers and use the LSPs, + +01:13:13.100 --> 01:13:15.140 +it will also generate better code for you. + +01:13:15.720 --> 01:13:17.420 +I think the one challenge I would add + +01:13:17.640 --> 01:13:19.560 +to what Barry said a moment ago + +01:13:19.580 --> 01:13:22.460 +is that if you're a developer and you're using, say, + +01:13:22.580 --> 01:13:24.620 +three or four type checkers at the same time, + +01:13:25.160 --> 01:13:27.220 +you also have to be careful about the fact + +01:13:27.740 --> 01:13:29.960 +that some of them won't flag an error + +01:13:30.040 --> 01:13:31.560 +that the other one will flag. 
+ +01:13:32.180 --> 01:13:34.620 +So I've recently written Python programs + +01:13:34.900 --> 01:13:37.740 +and even built a tool with one of my students named Benedict + +01:13:38.200 --> 01:13:41.240 +that will automatically generate Python programs + +01:13:41.520 --> 01:13:43.980 +that will cause type checkers to disagree with each other. + +01:13:44.400 --> 01:13:46.080 +There are cases where + +01:13:46.900 --> 01:13:49.080 +mypy will flag it as an error, + +01:13:49.900 --> 01:13:52.920 +but none of the other tools will flag it. + +01:13:53.470 --> 01:13:57.040 +And there are also cases where the new tools will all agree with each other, + +01:13:57.170 --> 01:13:58.480 +but disagree with mypy. + +01:13:58.960 --> 01:14:01.920 +So there is a type checker conformance test suite. + +01:14:02.300 --> 01:14:03.500 +But I think as developers, + +01:14:03.870 --> 01:14:06.240 +even though it might be the year of LSP and type checker, + +01:14:06.240 --> 01:14:09.920 +we also have to be aware of the fact that these tools are maturing + +01:14:10.210 --> 01:14:12.200 +and there's still disagreement among them. + +01:14:12.570 --> 01:14:15.920 +And also just different philosophies when it comes to how to type check + +01:14:16.360 --> 01:14:17.140 +and how to infer. + +01:14:17.780 --> 01:14:19.480 +And so we have to think about all of those things + +01:14:19.560 --> 01:14:21.800 +as these tools mature and become part of our ecosystem. + +01:14:22.120 --> 01:14:23.940 +Yeah, Greg, that last point is important. + +01:14:24.340 --> 01:14:24.700 +Sorry, go ahead. + +01:14:25.280 --> 01:14:27.780 +Out of curiosity, how did the things + +01:14:28.000 --> 01:14:29.220 +where the type checkers disagree + +01:14:29.920 --> 01:14:32.660 +match up with the actual runtime behavior of Python? + +01:14:33.580 --> 01:14:36.560 +Was it like false positives or false negatives? + +01:14:37.520 --> 01:14:38.920 +Ah, that's a really good question. 
+ +01:14:39.100 --> 01:14:40.680 +I'll give you more details in the show notes + +01:14:40.820 --> 01:14:43.060 +because we actually have it in a GitHub repository + +01:14:43.380 --> 01:14:44.500 +and I can share it with people. + +01:14:45.120 --> 01:14:47.740 +But I think some of it might simply be + +01:14:47.760 --> 01:14:52.640 +related to cases where mypy is more conformant to the spec, + +01:14:52.690 --> 01:14:56.900 +but the other new tools are not as conformant. + +01:14:57.320 --> 01:14:59.540 +So you can import overload from typing + +01:15:00.200 --> 01:15:02.280 +and then have a very overloaded function. + +01:15:03.100 --> 01:15:05.900 +And mypy will actually flag the fact + +01:15:06.040 --> 01:15:08.380 +that it's an overloaded function with multiple signatures, + +01:15:09.130 --> 01:15:11.860 +whereas Pyright and Pyrefly and Zuban + +01:15:12.300 --> 01:15:14.720 +will not actually flag that even though they should. + +01:15:16.640 --> 01:15:22.240 +Yeah. Another big area is optional versus not optional. Like, are you allowed to pass a thing + +01:15:22.300 --> 01:15:27.100 +that is an optional string when the thing accepts a string? Some stuff's like, yeah, + +01:15:27.160 --> 01:15:31.080 +it's probably fine. Others are like, no, no, no. This is an error that you have to do a check. And + +01:15:31.300 --> 01:15:35.600 +if you want to switch type checkers, you might end up with a thousand warnings that you didn't + +01:15:35.700 --> 01:15:41.000 +previously have because of an intentional difference of opinion on how strict to be, I think. + +01:15:41.260 --> 01:15:45.900 +Yeah. So you have to think about false positives and false negatives when you're willing to break + +01:15:45.920 --> 01:15:50.600 +the build because of a type error. All of those things are things you have to factor in. 
But to go + +01:15:50.780 --> 01:15:56.660 +quickly to this connection to AI, I know it's only recently, but the Pyrefly team actually announced + +01:15:57.240 --> 01:16:03.420 +that they're making Pyrefly work directly with Pydantic AI. So there's going to be an interoperability + +01:16:03.610 --> 01:16:09.120 +between those tools so that when you're building an AI agent using Pydantic AI, you can also then + +01:16:09.260 --> 01:16:14.740 +have better guarantees when you're using Pyrefly as your type checker. It makes total sense though, + +01:16:14.860 --> 01:16:17.680 +because then the reasoning LLM that's at the core of the agent + +01:16:18.120 --> 01:16:22.580 +can actually have that information before it tries to execute the code. + +01:16:22.820 --> 01:16:27.040 +And you don't get in that loop that they often get in. + +01:16:28.100 --> 01:16:29.460 +It can correct it before it runs. + +01:16:29.980 --> 01:16:31.040 +Yeah, really good point. + +01:16:31.980 --> 01:16:33.960 +I want to just sort of express my appreciation + +01:16:34.080 --> 01:16:36.460 +to all the people working on this typing stuff. + +01:16:37.100 --> 01:16:40.300 +As someone who's come from many, many years in dynamic languages, + +01:16:40.860 --> 01:16:42.940 +and I was always like, oh, typing. + +01:16:43.360 --> 01:16:45.700 +Those silly people, those statically typed languages. + +01:16:47.180 --> 01:16:51.480 +And I see, A, I appreciate the value. + +01:16:52.120 --> 01:16:57.660 +B, I love seeing how easy it is for people to ease into it when they're in Python. + +01:16:57.790 --> 01:16:58.840 +It's not all or nothing. + +01:16:59.880 --> 01:17:02.820 +C, I love the huge number of tools. + +01:17:02.950 --> 01:17:05.760 +The competition in this space is really exciting. + +01:17:06.220 --> 01:17:08.200 +And D, guess what? + +01:17:08.290 --> 01:17:09.440 +It really, really does help. 
+ +01:17:09.740 --> 01:17:15.500 +And I'll even add an E, which is my students who come from Java, C++, C#, and so forth, feel relief. + +01:17:16.280 --> 01:17:21.380 +They find that without type checking, it's like doing a trapeze act without a safety net. + +01:17:21.820 --> 01:17:26.700 +And so they're very happy to have that typing in there. + +01:17:27.140 --> 01:17:28.520 +So kudos to everyone. + +01:17:29.180 --> 01:17:29.440 +Yeah. + +01:17:30.340 --> 01:17:30.700 +All right, folks. + +01:17:30.920 --> 01:17:31.500 +We are out of time. + +01:17:31.780 --> 01:17:33.920 +This could literally go for hours longer. + +01:17:35.620 --> 01:17:36.300 +It was a big year. + +01:17:36.880 --> 01:17:37.720 +It was a big year. + +01:17:37.880 --> 01:17:41.060 +I think we need to just have a final word. + +01:17:42.840 --> 01:17:44.540 +I'll start and we'll just go around. + +01:17:44.970 --> 01:17:49.400 +So my final thought here is we've talked about some things that are negatives + +01:17:49.760 --> 01:17:51.820 +or sort of downers or whatever here and there, + +01:17:52.940 --> 01:17:58.120 +but I still think it's an incredibly exciting time to be a developer, + +01:17:58.310 --> 01:17:58.760 +data scientist. + +01:17:59.020 --> 01:18:00.500 +There's so much opportunity out there. + +01:18:01.080 --> 01:18:04.020 +There's so many things to learn and take advantage of and stay on top of. + +01:18:05.900 --> 01:18:06.240 +Amazing. + +01:18:06.420 --> 01:18:08.800 +Every day is slightly more amazing than the previous day. + +01:18:08.870 --> 01:18:10.140 +So I love it. + +01:18:10.960 --> 01:18:11.820 +Gregory, let's go to you next. + +01:18:11.920 --> 01:18:12.520 +Let's go around the circle. + +01:18:13.020 --> 01:18:16.960 +Yeah, I wanted to give a shout out to all of the local Python conferences. + +01:18:17.740 --> 01:18:21.180 +I actually, on a regular basis, have attended the PyOhio conference. 
+ +01:18:21.980 --> 01:18:23.020 +And it is incredible. + +01:18:23.320 --> 01:18:26.400 +The organizers do an absolutely amazing job. + +01:18:27.020 --> 01:18:32.120 +And they have it hosted on a campus, oftentimes at Ohio State or Cleveland State University. + +01:18:33.080 --> 01:18:39.180 +And incredibly, PyOhio is a free conference that anyone can attend with no registration fee. + +01:18:39.800 --> 01:18:47.100 +So Michael, on a note that I think is really positive: wow, I'm so excited about the regional Python conferences that I've been able to attend. + +01:18:51.540 --> 01:18:51.700 +Thomas. + +01:18:53.640 --> 01:18:55.280 +Wow, I didn't expect this. + +01:18:55.380 --> 01:19:06.860 +So, I think I want to give a shout out to new people joining the community and also joining the core developer team as triagers, or just drive-by commenters. + +01:19:07.360 --> 01:19:17.140 +I know we harped a little bit about people, you know, giving strong opinions in discussions, but I always look to the far future as well as the near future. + +01:19:17.340 --> 01:19:19.020 +And we always need new people. + +01:19:19.110 --> 01:19:20.180 +We need new ideas. + +01:19:20.290 --> 01:19:21.180 +We need new opinions. + +01:19:21.480 --> 01:19:26.700 +So yeah, I'm excited that there's still people joining and signing up. + +01:19:26.980 --> 01:19:34.320 +And even when it's thankless work, so I guess I want to say thank you to people doing all the thankless work. + +01:19:36.360 --> 01:19:36.580 +Jodie. + +01:19:37.800 --> 01:19:45.200 +Yeah, I want to say this is actually really only my third year or so in the Python community. + +01:19:45.460 --> 01:19:47.700 +So before that, I was just sort of on the fringes, right? + +01:19:47.880 --> 01:19:51.600 +And after I started advocacy, I started going to the conferences and meeting people. 
+ +01:19:52.560 --> 01:19:59.160 +And I think I didn't kind of get how special the community was until I watched the Python documentary this year. + +01:19:59.520 --> 01:20:06.620 +And I talked to Paul about this, Paul Everitt, afterwards, also made fun of him for his like early 2000s fashion. + +01:20:07.340 --> 01:20:16.460 +But I think I'm a relative newcomer to this community and you've all made me feel so welcome. + +01:20:16.900 --> 01:20:27.280 +And I guess I want to thank all the incumbents for everything you've done to make this such a special tech community for minorities and everyone, newbies. + +01:20:28.260 --> 01:20:29.680 +Python is love. + +01:20:33.160 --> 01:20:33.660 +Oh, geez. + +01:20:33.750 --> 01:20:34.760 +How am I supposed to follow that? + +01:20:35.880 --> 01:20:36.860 +A little tear. + +01:20:41.940 --> 01:20:52.180 +Yeah, I think one of the interesting things that we're kind of looping on here is I think the language evolution has slowed down, but it's obviously not stopped, right? + +01:20:52.340 --> 01:20:55.300 +Like as Thomas pointed out, there's a lot more stuff happening behind the scenes. + +01:20:56.320 --> 01:21:02.180 +Lazy imports are coming, and that was a syntactic change, which apparently brings out the mean side of some people. + +01:21:03.960 --> 01:21:06.840 +And we've obviously got our challenges and stuff, but things are still going. + +01:21:07.020 --> 01:21:08.380 +We're still chugging along. + +01:21:08.490 --> 01:21:21.700 +We're still trying to be an open, welcoming place for people like Jodie and everyone else who's new coming on over and continue to be a fun place for all of us slightly graying beard people who have been here for a long time to make us want to stick around. + +01:21:24.360 --> 01:21:28.420 +So, yeah, I think it's just more of the same, honestly. + +01:21:28.660 --> 01:21:33.760 +It's all of us just continue to do what we can to help out to keep this community being a great place. 
+ +01:21:33.900 --> 01:21:36.740 +and it all just keeps going forward. + +01:21:37.040 --> 01:21:38.400 +And I'll just end with, + +01:21:38.510 --> 01:21:39.340 +if you work for a company + +01:21:39.440 --> 01:21:40.540 +that's not sponsoring the PSF, + +01:21:40.700 --> 01:21:41.160 +please do so. + +01:21:48.460 --> 01:21:51.800 +So it's rare to have, + +01:21:52.560 --> 01:21:53.660 +I mean, a programming language + +01:21:53.810 --> 01:21:54.860 +or any sort of tool + +01:21:55.900 --> 01:21:57.640 +where it is both really, + +01:21:57.840 --> 01:21:59.400 +really beneficial to your career + +01:22:00.300 --> 01:22:01.380 +and you get to hang out + +01:22:01.540 --> 01:22:02.520 +with really special, + +01:22:02.960 --> 01:22:09.200 +nice, interesting people. And it's easy to take all that for granted if you've been steeped in + +01:22:09.200 --> 01:22:13.640 +the community. I went to a conference about six months ago, a non-Python conference, + +01:22:14.500 --> 01:22:20.240 +and that was shocking to me to discover that all the speakers were from advertisers and sponsors. + +01:22:21.220 --> 01:22:25.700 +Everything was super commercialized. People were not interested in just like hanging out and sharing + +01:22:25.880 --> 01:22:30.399 +with each other. And it was a shock to me because I've been to basically only Python conferences + +01:22:30.420 --> 01:22:35.620 +for so many years, I was like, oh, that's not the norm in the industry. So we've got something + +01:22:35.790 --> 01:22:40.360 +really special going that not only is good for the people, but good for everyone's careers and + +01:22:40.430 --> 01:22:45.180 +mutually reinforcing and helping each other. And that's really fantastic. And we should appreciate + +01:22:45.420 --> 01:22:54.919 +that. Absolutely. Barry, final word. Well, Thomas stole my thunder just a little bit, but just to tie + +01:22:54.940 --> 01:22:56.700 +a couple of these ideas together. 
+ +01:22:58.100 --> 01:23:01.580 +Python, and Brett said this, right? + +01:23:02.240 --> 01:23:03.680 +Python is the community, + +01:23:03.940 --> 01:23:05.420 +or the community is Python. + +01:23:06.060 --> 01:23:10.220 +There's no company that is telling anybody + +01:23:10.980 --> 01:23:12.080 +what Python should be. + +01:23:12.560 --> 01:23:14.000 +Python is what we make it. + +01:23:14.480 --> 01:23:18.020 +And as folks like myself get a little older + +01:23:18.140 --> 01:23:25.800 +and we have younger people coming into the community, + +01:23:26.060 --> 01:23:27.940 +both developers and everything else, + +01:23:28.460 --> 01:23:31.500 +who are shaping Python into their vision. + +01:23:32.180 --> 01:23:35.680 +I encourage you, if you've thought about becoming a core dev, + +01:23:36.260 --> 01:23:37.060 +find a mentor. + +01:23:37.210 --> 01:23:38.920 +There are people out there that will help you. + +01:23:39.010 --> 01:23:44.320 +If you want to be involved in the community, the PSF, reach out. + +01:23:44.390 --> 01:23:47.160 +There are people who will help guide you into this community. + +01:23:47.180 --> 01:23:49.120 +You can be involved. + +01:23:49.620 --> 01:23:58.700 +Do not let any self-imposed limitations stop you from becoming part of the Python community in the way that you want to. + +01:23:59.300 --> 01:24:06.060 +And eventually run for the steering council because we need many, many, many more candidates next year. + +01:24:06.860 --> 01:24:13.300 +And you don't need any qualifications either because I'm a high school dropout and I never went to college or anything. + +01:24:13.920 --> 01:24:14.720 +And look at me. + +01:24:15.740 --> 01:24:15.900 +Amazing. + +01:24:15.980 --> 01:24:17.640 +And I have a PhD, and I will tell you, + +01:24:17.700 --> 01:24:19.620 +I did not need all that to become a Python developer, + +01:24:19.880 --> 01:24:22.020 +because I'd already learned Python before I got the PhD. 
+ +01:24:22.620 --> 01:24:22.820 +So-- + +01:24:24.360 --> 01:24:27.100 +Well, I'm a bass player, so if I can do it, anybody can do it. + +01:24:30.160 --> 01:24:31.580 +Thank you, everyone, for being here. + +01:24:31.900 --> 01:24:33.340 +This awesome look back at the year, + +01:24:33.440 --> 01:24:34.760 +and I really appreciate you all taking the time. + +01:24:35.220 --> 01:24:35.880 +Thank you, Michael. + +01:24:36.040 --> 01:24:36.180 +Thanks. + +01:24:36.560 --> 01:24:36.720 +Thanks. + +01:24:36.720 --> 01:24:37.580 +This was great fun. + +01:24:38.060 --> 01:24:38.360 +Thanks, everybody. + +01:24:38.680 --> 01:24:39.260 +Bye, everybody. + +01:24:39.280 --> 01:24:39.840 +Good to see everybody. + +01:24:40.340 --> 01:24:40.820 +I thought it was. + diff --git a/youtube_transcripts/533-web-frameworks-in-prod-by-their-creators-youtube.vtt b/youtube_transcripts/533-web-frameworks-in-prod-by-their-creators-youtube.vtt new file mode 100644 index 0000000..5e10e32 --- /dev/null +++ b/youtube_transcripts/533-web-frameworks-in-prod-by-their-creators-youtube.vtt @@ -0,0 +1,3713 @@ +WEBVTT + +00:00:03.399 --> 00:00:11.940 +Hello, hello, Carlton, Sebastian, David, Cody, Janek, Phil, Jeff, welcome back to + +00:00:12.040 --> 00:00:12.920 +Talk Python To Me, all of you. + +00:00:13.920 --> 00:00:14.840 +Thanks for having us. + +00:00:15.440 --> 00:00:16.820 +Yeah, it's... + +00:00:20.140 --> 00:00:23.200 +We're here for what may be my favorite topic, for sure. + +00:00:23.590 --> 00:00:29.380 +Something I spend most of my time on is web API stuff, which is awesome. + +00:00:29.560 --> 00:00:37.200 +I'm so excited to have you here to give your inside look at how people should be running your framework, + +00:00:38.480 --> 00:00:43.620 +at least the one that you significantly contribute to, depending on which framework we're talking about, right? + +00:00:45.340 --> 00:00:50.420 +So it's going to be a lot of fun, and I'm really excited to talk about it. 
+ +00:00:50.620 --> 00:01:02.480 +However, an interesting fact that I've been throwing out a lot lately is that fully half of the people doing professional Python development have only been doing it for two years or less. + +00:01:03.739 --> 00:01:05.280 +And some of you have been on the show. + +00:01:05.360 --> 00:01:07.020 +It was maybe two years longer than that. + +00:01:07.480 --> 00:01:12.000 +Let's just do a quick round of introductions for people who don't necessarily know you. + +00:01:12.020 --> 00:01:14.380 +We'll go around the squares here in the screen sharing. + +00:01:14.640 --> 00:01:15.820 +So Carlton, you're up first. + +00:01:16.440 --> 00:01:17.380 +Oh, I get to go first. + +00:01:17.600 --> 00:01:17.680 +Brilliant. + +00:01:17.840 --> 00:01:18.800 +Well, I'm Carlton. + +00:01:18.940 --> 00:01:21.540 +I work on the Django REST Framework mostly. + +00:01:21.740 --> 00:01:22.980 +I'm a former Django fellow. + +00:01:23.090 --> 00:01:25.380 +I maintain a number of packages in the ecosystem. + +00:01:25.640 --> 00:01:29.400 +And the last few years, I've been back to building stuff with Django rather than working on it. + +00:01:29.870 --> 00:01:32.720 +So I run a build startup that's, well, we're still going. + +00:01:32.790 --> 00:01:33.820 +So I'm quite excited about that. + +00:01:34.759 --> 00:01:35.200 +Awesome. + +00:01:35.720 --> 00:01:38.520 +How is it to be building with Django rather than building Django? + +00:01:39.120 --> 00:01:42.000 +Oh, I'm literally having the time of my life. + +00:01:42.180 --> 00:01:44.740 +I spent five years as a Django fellow working on Django. + +00:01:45.160 --> 00:01:48.760 +And I just built up this backlog of things I wanted to do. + +00:01:49.160 --> 00:01:53.580 +And I had no time and no capacity to work on them. + +00:01:53.640 --> 00:01:55.040 +And it's just a delight. + +00:01:55.800 --> 00:01:59.760 +And every day I sit down on my computer thinking, oh, what's today? 
+ +00:01:59.920 --> 00:02:00.740 +I look at the background. + +00:02:00.980 --> 00:02:01.440 +Oh, yes. + +00:02:01.980 --> 00:02:03.820 +And every day a delight. + +00:02:03.980 --> 00:02:06.520 +So I'm still just loving it. + +00:02:06.780 --> 00:02:07.460 +That's awesome. + +00:02:07.600 --> 00:02:14.160 +So more often you're appreciating your former self than cursing your former self for the way you built. + +00:02:17.300 --> 00:02:21.260 +Yeah, that's an interesting one. I think we should move on before I have a chance. + +00:02:21.280 --> 00:02:28.260 +All right. All right. Speaking of building with and for Sebastian, FastAPI. Hello. + +00:02:29.280 --> 00:02:35.940 +Hello. Okay. Intro for the ones that don't know me. I'm Sebastian Ramirez. I created FastAPI. + +00:02:36.580 --> 00:02:40.980 +Yeah, that's pretty much it. And now I have been building a company for the last two years, + +00:02:41.000 --> 00:02:43.240 +FastAPI Cloud, to deploy FastAPI. + +00:02:43.250 --> 00:02:47.040 +So I get to drink from funny cups. + +00:02:48.170 --> 00:02:49.660 +The world's best boss. + +00:02:50.420 --> 00:02:50.860 +Amazing. + +00:02:51.570 --> 00:02:54.020 +So I think you deserve to give a bit of a shout out + +00:02:54.020 --> 00:02:55.020 +to FastAPI Cloud. + +00:02:55.140 --> 00:02:55.640 +That's a big deal. + +00:02:56.440 --> 00:02:57.100 +Thank you. + +00:02:57.240 --> 00:02:57.860 +Thank you very much. + +00:02:58.000 --> 00:02:59.220 +Yeah, it's super fun. + +00:03:00.120 --> 00:03:03.000 +The idea is to make it super simple to deploy FastAPI + +00:03:03.140 --> 00:03:03.460 +applications. + +00:03:03.860 --> 00:03:06.120 +The idea with FastAPI was to make it very simple + +00:03:06.380 --> 00:03:08.520 +to build applications, build APIs, + +00:03:09.760 --> 00:03:14.300 +like go from idea to product in record time. + +00:03:14.400 --> 00:03:15.600 +That was the idea with FastAPI. 
+ +00:03:15.820 --> 00:03:19.560 +But then deploying that in many cases is just too cumbersome. + +00:03:19.560 --> 00:03:20.280 +It's too complicated. + +00:03:20.560 --> 00:03:22.000 +There are just so many things to learn. + +00:03:22.480 --> 00:03:26.760 +So I wanted to bring something for people to be able to say like, hey, just one + +00:03:26.860 --> 00:03:29.080 +command, FastAPI deploy, and we take care of the rest. + +00:03:29.640 --> 00:03:33.760 +And then me and the team, I have an amazing team that I've been able to work with. + +00:03:34.400 --> 00:03:38.520 +We suffer all the cloud pain so that people don't have to deal with that. + +00:03:38.640 --> 00:03:43.640 +And yeah, it's painful to build, but it's so cool to use it. + +00:03:43.670 --> 00:03:47.220 +You know, like that's the part when I say like, yes, this was worth it. + +00:03:47.290 --> 00:03:50.140 +When I get to use the thing myself, that is super cool. + +00:03:50.720 --> 00:03:54.240 +Yeah, I'm assuming you build FastAPI Cloud with FastAPI somewhat. + +00:03:55.240 --> 00:03:56.840 +Yes, yes, yes, exactly. + +00:03:57.220 --> 00:03:59.120 +FastAPI Cloud runs on FastAPI Cloud. + +00:03:59.780 --> 00:04:03.740 +And I get just to like now run random things in there and I'm like, yes. + +00:04:05.920 --> 00:04:06.120 +Awesome. + +00:04:06.920 --> 00:04:08.700 +Well, yeah, congrats to that again. + +00:04:08.880 --> 00:04:09.400 +That's super cool. + +00:04:10.120 --> 00:04:10.820 +David Lord, welcome. + +00:04:11.200 --> 00:04:11.480 +Welcome back. + +00:04:12.740 --> 00:04:13.240 +Yeah, hello. + +00:04:14.480 --> 00:04:15.220 +I'm David Lord. + +00:04:15.440 --> 00:04:32.220 +I'm the lead maintainer of Pallets, which is Flask, Jinja, Click, Werkzeug, ItsDangerous, MarkupSafe, and now Pallets Eco, which is a bunch of the famous extensions for Flask that are getting community maintenance now. 
+ +00:04:34.360 --> 00:04:41.120 +I've been doing that since, I think I've been the lead maintainer since like 2019, but a maintainer since like 2017. + +00:04:41.700 --> 00:04:42.360 +So it's been a while. + +00:04:43.820 --> 00:04:46.760 +Yeah, that is, you know, that's been a while. + +00:04:46.920 --> 00:04:49.240 +We're coming up on seven, eight years. + +00:04:49.540 --> 00:04:49.900 +That's crazy. + +00:04:51.440 --> 00:04:51.920 +Time flies. + +00:04:51.960 --> 00:04:54.660 +It's always funny because I always feel like I've been doing it for way, way longer. + +00:04:54.980 --> 00:04:57.720 +And then I look at like the actual date that I got added as a maintainer. + +00:04:57.720 --> 00:04:59.040 +I'm like, well, it couldn't have been that late. + +00:04:59.120 --> 00:05:00.320 +I was doing stuff before that, right? + +00:05:01.620 --> 00:05:05.060 +I'm sure you were deep in Flask before you got added as a maintainer of it, right? + +00:05:06.370 --> 00:05:06.500 +Yeah. + +00:05:07.120 --> 00:05:07.900 +Yeah, yeah. + +00:05:09.780 --> 00:05:15.520 +Let's say welcome to Phil Jones, since you are also on the same org. + +00:05:15.980 --> 00:05:16.080 +Next. + +00:05:16.700 --> 00:05:17.380 +Hey, welcome back. + +00:05:18.020 --> 00:05:19.580 +Hello, yeah, I'm Phil Jones. + +00:05:19.710 --> 00:05:23.320 +I am the author of Quart, which is also part of Pallets. + +00:05:23.720 --> 00:05:27.920 +I also work on Werkzeug and Flask and help out there. + +00:05:28.820 --> 00:05:31.520 +And I've done a server called Hypercorn as well. + +00:05:31.610 --> 00:05:34.900 +So a bit of interest in that part of the ecosystem. + +00:05:36.620 --> 00:05:38.020 +So what is Quart for people who don't know? + +00:05:38.880 --> 00:05:42.080 +So Quart is basically Flask with async/await. + +00:05:42.580 --> 00:05:46.120 +And that was the idea behind it really, to make it possible to do async/await. + +00:05:47.000 --> 00:05:48.720 +So yeah, that's pretty much it. 
+

00:05:48.720 --> 00:05:50.980
+If we only manage to merge them, we will.

+00:05:52.040 --> 00:05:58.580
+And the goal now with Quart as part of Pallets is to eventually have it be one code base with Flask.

+00:05:59.780 --> 00:06:04.300
+But given that we both have small children now, we're moving a lot slower.

+00:06:06.360 --> 00:06:11.220
+Having kids is great. I have three kids. Productivity is not a thing that they are known to

+00:06:12.240 --> 00:06:18.160
+imbue on the parents, right? Especially in the early days. I want to say, Phil, thank you. I've been

+00:06:18.320 --> 00:06:23.940
+running Quart for a couple of my websites lately and it's been amazing. Nice. Yeah, I also use it at

+00:06:24.080 --> 00:06:28.640
+work. We've got all our stuff in Quart, which is, yeah, it's really good fun. Yeah, yeah.

+00:06:28.840 --> 00:06:33.340
+So when people, if they listen to the show or they go to the website of the

+00:06:33.340 --> 00:06:36.440
+show and they're not on YouTube, then that somehow involves Quart.

+00:06:37.420 --> 00:06:38.040
+Janek, welcome.

+00:06:40.620 --> 00:06:40.740
+Hey.

+00:06:41.940 --> 00:06:43.160
+Yeah, I'm Janek Nivertny.

+00:06:43.260 --> 00:06:45.120
+I work on Litestar.

+00:06:45.120 --> 00:06:48.140
+I just looked up how long it's been because I was curious myself.

+00:06:49.540 --> 00:06:54.080
+I also had the same feeling that it's been a lot longer than, it's actually only been

+00:06:54.320 --> 00:06:54.740
+three years.

+00:06:56.140 --> 00:06:56.220
+Yeah.

+00:06:56.620 --> 00:06:59.820
+And I also noticed something with all you guys here in the room.

+00:07:00.160 --> 00:07:07.820
+I use almost all of the projects you maintain at work, which is quite nice.

+00:07:07.880 --> 00:07:10.200
+We have a very big Django deployment.

+00:07:10.360 --> 00:07:11.680
+We have some Flask deployments.

+00:07:11.680 --> 00:07:13.300
+We have a few FastAPI deployments.
+

00:07:13.780 --> 00:07:18.080
+I think we have one Quart deployment, and we also have two Litestar deployments,

+00:07:18.980 --> 00:07:21.480
+which obviously is a lot of fun to work with.

+00:07:21.780 --> 00:07:27.760
+But I find it really nice actually to work with all these different things.

+00:07:27.960 --> 00:07:32.080
+It's super interesting also because everything has its own niche

+00:07:32.300 --> 00:07:34.440
+that it's really good at.

+00:07:34.680 --> 00:07:38.860
+And even, you know, you think if you maintain a framework yourself,

+00:07:40.000 --> 00:07:42.400
+you tend to always recommend it for everything.

+00:07:43.200 --> 00:07:44.980
+But I noticed it's not actually true.

+00:07:45.180 --> 00:07:49.100
+There's actually a few cases where I don't recommend Litestar.

+00:07:49.380 --> 00:07:55.220
+I recommend just use Django for this, or use Flask for that,

+00:07:55.220 --> 00:07:59.600
+or use FastAPI for this, because they are quite different after all.

+00:08:01.140 --> 00:08:05.420
+And I find that really interesting and nice.

+00:08:05.440 --> 00:08:09.020
+And I think it's a good sign of a healthy ecosystem

+00:08:09.460 --> 00:08:11.880
+if it's not just the same thing but different,

+00:08:12.180 --> 00:08:14.520
+but it actually brings something very unique and different

+00:08:14.520 --> 00:08:15.020
+to the table.

+00:08:16.100 --> 00:08:18.500
+I think that's a great attitude.

+00:08:18.780 --> 00:08:19.360
+It's very interesting.

+00:08:19.760 --> 00:08:22.640
+You know, I feel like there's a lot of people who feel like

+00:08:23.540 --> 00:08:26.100
+they've kind of got to pick their tech stack for everything.

+00:08:26.880 --> 00:08:28.100
+You know, I'm going to build a static site.

+00:08:28.230 --> 00:08:30.360
+Like, well, I've got to have a Python-based static site

+00:08:30.500 --> 00:08:31.420
+builder. Like, well, it's a static site.
+

00:08:31.450 --> 00:08:33.539
+Who cares what technology makes it turn--

+00:08:33.900 --> 00:08:35.479
+you're writing Markdown, out comes HTML.

+00:08:35.840 --> 00:08:37.479
+Who cares what's in the middle, for example, right?

+00:08:38.080 --> 00:08:42.039
+And I feel like that's kind of a life lesson learned.

+00:08:43.039 --> 00:08:43.539
+There, I did.

+00:08:44.180 --> 00:08:44.900
+Yeah, that's awesome.

+00:08:45.860 --> 00:08:46.140
+Cody.

+00:08:47.140 --> 00:08:47.380
+Hello, hello.

+00:08:48.180 --> 00:08:49.600
+Yeah, hey guys, I'm Cody Fincher.

+00:08:49.780 --> 00:08:51.380
+I'm also one of the maintainers of Litestar.

+00:08:51.680 --> 00:08:54.240
+I've been there just a little bit longer than Yannick.

+00:08:55.319 --> 00:08:57.220
+And so it's been about four years now.

+00:08:58.439 --> 00:09:00.240
+And Yannick actually teed this up perfectly

+00:09:00.260 --> 00:09:01.720
+'cause I was gonna say something very similar.

+00:09:02.120 --> 00:09:03.180
+I currently work for Google.

+00:09:03.280 --> 00:09:04.800
+I've been there for about three and a half years now.

+00:09:05.080 --> 00:09:07.840
+And we literally have every one of the frameworks

+00:09:07.920 --> 00:09:09.820
+you guys just mentioned, and they're all in production.

+00:09:10.040 --> 00:09:14.620
+And so one of the things that you'll see on the Litestar org

+00:09:14.780 --> 00:09:16.439
+and part of the projects that we maintain

+00:09:16.460 --> 00:09:18.380
+is that we have these optional batteries,

+00:09:18.720 --> 00:09:20.500
+and most of the batteries that we have

+00:09:20.700 --> 00:09:22.680
+all work with the frameworks from you guys.

+00:09:22.790 --> 00:09:26.360
+And so it's nice to be able to use that stuff,

+00:09:26.650 --> 00:09:29.120
+you know, regardless of what tooling you've got

+00:09:29.280 --> 00:09:31.100
+or what, you know, what project it is.
+

00:09:31.130 --> 00:09:34.080
+And so, yeah, having that interoperability

+00:09:34.340 --> 00:09:36.080
+and the ability to kind of go between the frameworks

+00:09:36.320 --> 00:09:38.740
+that work the best for the right situation is crucial.

+00:09:38.870 --> 00:09:40.920
+And so I'm glad you mentioned that, Yannick.

+00:09:40.950 --> 00:09:42.040
+But yeah, nice to see you guys

+00:09:42.130 --> 00:09:43.020
+and nice to be back on the show.

+00:09:43.960 --> 00:09:45.160
+Cody, tell people what Litestar is.

+00:09:45.300 --> 00:09:48.820
+I know I had both you guys and Jacob on a while ago,

+00:09:48.840 --> 00:09:49.980
+but it's been a couple of years, I think.

+00:09:50.620 --> 00:09:53.460
+Yeah, so, I mean, Litestar, at its core,

+00:09:53.660 --> 00:09:56.540
+is really a web framework that kind of sits somewhere in between,

+00:09:56.800 --> 00:10:00.720
+I'd say, Flask and FastAPI and Django.

+00:10:00.900 --> 00:10:04.940
+So whereas Flask doesn't really bundle a lot of batteries,

+00:10:05.100 --> 00:10:08.860
+there's a huge amount of third-party libraries and ecosystem

+00:10:09.240 --> 00:10:10.740
+that's built around it that people can add into it,

+00:10:10.740 --> 00:10:13.719
+but there's not really, like, for instance, a database adapter

+00:10:13.740 --> 00:10:16.900
+or a database plugin or plugins for Vite

+00:10:16.930 --> 00:10:17.960
+or something like that, right?

+00:10:18.060 --> 00:10:19.000
+For front end development.

+00:10:19.090 --> 00:10:22.440
+And so what we have been doing is building an API framework

+00:10:22.620 --> 00:10:25.680
+that is very similar in concept to FastAPI

+00:10:25.920 --> 00:10:27.200
+that is also extensible.

+00:10:27.270 --> 00:10:29.700
+So if you want to use the batteries, they're there for you.

+00:10:30.460 --> 00:10:32.160
+But if you don't want to use them, you don't have to, right?
+

00:10:32.280 --> 00:10:35.840
+And so a lot of the tooling that we built for Litestar

+00:10:35.980 --> 00:10:39.360
+was birthed out of a startup that I was in

+00:10:39.700 --> 00:10:40.860
+prior to joining Google.

+00:10:41.050 --> 00:10:43.700
+And so having all this boilerplate,

+00:10:43.960 --> 00:10:44.740
+it needed somewhere to go.

+00:10:44.880 --> 00:10:47.540
+And so a lot of this stuff ended up being plugins,

+00:10:47.880 --> 00:10:50.040
+which is what we bundled into Litestar

+00:10:50.180 --> 00:10:52.660
+so that you can kind of add in this extra functionality.

+00:10:53.980 --> 00:10:55.660
+And so I know I'm getting long-winded.

+00:10:55.820 --> 00:10:59.160
+It's somewhere between Django and Flask,

+00:10:59.200 --> 00:11:01.100
+if you were to think about it in terms of a spectrum,

+00:11:01.160 --> 00:11:04.680
+in terms of what it gives you in terms of a web framework.

+00:11:04.960 --> 00:11:07.160
+But in short, it does everything that all the other guys do.

+00:11:08.119 --> 00:11:09.600
+Yeah, very neat, very neat.

+00:11:10.160 --> 00:11:11.720
+It's definitely a framework I admire.

+00:11:12.600 --> 00:11:14.480
+Jeff Triplett, so glad you could make it.

+00:11:15.680 --> 00:11:16.440
+Yeah, thanks for having me.

+00:11:16.930 --> 00:11:17.780
+Yeah, I'm Jeff Triplett.

+00:11:17.780 --> 00:11:18.660
+I'm out of Lawrence, Kansas.

+00:11:19.080 --> 00:11:21.180
+I'm a consultant at a company called Revolution Systems.

+00:11:21.930 --> 00:11:24.720
+I was on, some people know me from being on the Python Software Foundation board.

+00:11:25.080 --> 00:11:26.200
+I've been off that for a few years.

+00:11:27.460 --> 00:11:30.140
+As of last week, I'm the president of the Django Software Foundation.

+00:11:30.490 --> 00:11:31.800
+So I've been on that board for a year.

+00:11:32.340 --> 00:11:34.240
+I'm kind of a Django power user, I guess.
+

00:11:34.340 --> 00:11:35.520
+I've used it for about 20 years.

+00:11:36.300 --> 00:11:39.920
+And I've kind of not really worked on, I don't even think I have a patch anymore in Django.

+00:11:40.820 --> 00:11:42.660
+But I've done a lot with the community,

+00:11:43.570 --> 00:11:45.800
+done a lot with contributing through conferences

+00:11:46.620 --> 00:11:47.500
+and using utilities.

+00:11:48.190 --> 00:11:51.120
+I try to promote Carlton's applications like Neapolitan.

+00:11:52.080 --> 00:11:54.120
+If I like tools, Python tools in general,

+00:11:54.250 --> 00:11:55.260
+I try to advocate for it.

+00:11:56.140 --> 00:11:59.640
+Like Yannick, I've also used all of these applications.

+00:11:59.920 --> 00:12:00.980
+Litestar, I haven't, but I have a friend

+00:12:01.010 --> 00:12:02.040
+who talks about it a lot.

+00:12:02.430 --> 00:12:04.020
+And so I feel like I know a lot about it.

+00:12:04.600 --> 00:12:06.960
+As a consultant, we tend to go with the best tool for the job.

+00:12:07.070 --> 00:12:08.300
+So I've done a little bit of FastAPI.

+00:12:08.820 --> 00:12:10.320
+I worked with Flask a lot over the years,

+00:12:10.820 --> 00:12:12.580
+even though we're primarily a Django shop.

+00:12:13.430 --> 00:12:14.660
+It just depends on what the client needs.

+00:12:15.300 --> 00:12:18.800
+Yeah, and you see a lot of different sizes of web app deployments,

+00:12:18.960 --> 00:12:21.020
+so I think that's going to be an interesting angle for sure.

+00:12:21.920 --> 00:12:22.500
+Yeah, absolutely.

+00:12:22.800 --> 00:12:25.140
+Small ones to hundreds of servers.

+00:12:25.540 --> 00:12:27.680
+We don't see it as much anymore the last four or five years,

+00:12:28.540 --> 00:12:30.120
+especially with CDNs and caching.

+00:12:30.900 --> 00:12:33.780
+We just don't see load like we did 10 years ago or so.
+ +00:12:34.500 --> 00:12:36.320 +And then I also do a lot of small, + +00:12:36.730 --> 00:12:38.620 +I kind of call some of them little dumb projects, + +00:12:38.790 --> 00:12:39.540 +but some are just fun. + +00:12:40.120 --> 00:12:44.740 +Like I've got a FastAPI WebRing that I wrote a year ago for April Fool's Day. + +00:12:44.900 --> 00:12:48.160 +And for some reason, that kind of took off and people liked it, even though it was kind + +00:12:48.160 --> 00:12:48.480 +of a joke. + +00:12:49.100 --> 00:12:51.160 +So I started like peppering it on a bunch of sites. + +00:12:51.310 --> 00:12:52.940 +And I maintain like Django packages. + +00:12:53.330 --> 00:12:55.120 +I do a newsletter, Django News Newsletter. + +00:12:55.980 --> 00:12:56.920 +Just kind of lots of fun stuff. + +00:12:57.580 --> 00:12:58.380 +Yeah, awesome. + +00:12:59.320 --> 00:13:01.520 +Well, definitely looking forward to hearing all of your opinions. + +00:13:02.560 --> 00:13:06.760 +So I've got a bunch of different Your App in production topics. + +00:13:07.020 --> 00:13:10.300 +I thought we could just work around and talk over. + +00:13:11.180 --> 00:13:13.180 +So I thought maybe the first one is, + +00:13:13.840 --> 00:13:16.900 +what would you recommend, or if you don't really + +00:13:16.900 --> 00:13:18.160 +have a strong recommendation, what + +00:13:18.200 --> 00:13:21.860 +would you choose for yourself to put your app + +00:13:22.180 --> 00:13:24.120 +in your framework in production? + +00:13:24.360 --> 00:13:27.040 +I'm thinking app servers, reverse proxies + +00:13:27.040 --> 00:13:28.400 +like Nginx or Caddy. + +00:13:29.760 --> 00:13:31.080 +Do you go for threaded-- + +00:13:31.620 --> 00:13:32.960 +try to scale out with threads. + +00:13:33.160 --> 00:13:38.240 +You try to scale out with processes, you know, Docker, no Docker, Kubernetes. + +00:13:39.600 --> 00:13:40.380 +What are we doing here, folks? + +00:13:41.600 --> 00:13:41.900 +Carlton. 
+

00:13:42.460 --> 00:13:43.840
+I think we'll just keep going around the circle here.

+00:13:44.000 --> 00:13:47.460
+So you may get the first round of everyone.

+00:13:47.580 --> 00:13:48.300
+No, I'll try to mix it up.

+00:13:48.400 --> 00:13:52.320
+Okay, so I do the oldest school thing in the book.

+00:13:52.390 --> 00:13:56.280
+I run Nginx as my front end.

+00:13:56.580 --> 00:14:00.480
+I'll stick a WSGI server behind it with a pre-fork, you know, a few workers,

+00:14:00.860 --> 00:14:04.740
+depending on CPU size, depending on the kind of requests I'm handling.

+00:14:05.440 --> 00:14:09.040
+These days, in order to handle long-lived requests,

+00:14:09.140 --> 00:14:11.540
+like server-sent events, that kind of WebSocket type things,

+00:14:11.620 --> 00:14:13.580
+I'll run an ASGI server as a kind of sidecar.

+00:14:14.759 --> 00:14:16.740
+Okay. I've been thinking about this a lot, actually.

+00:14:17.160 --> 00:14:18.020
+But yeah, this is interesting.

+00:14:18.300 --> 00:14:22.120
+If you're running a small site and you want long-lived requests,

+00:14:22.160 --> 00:14:23.740
+just run ASGI. Just use ASGI.

+00:14:23.880 --> 00:14:27.240
+Because any of the servers, Hypercorn, Uvicorn, Daphne,

+00:14:27.660 --> 00:14:31.200
+Well, Granian is the new hot kid on the block, right?

+00:14:31.680 --> 00:14:33.740
+All of those will handle your traffic, no problem.

+00:14:34.020 --> 00:14:37.680
+But for me, the scaling patterns in WSGI are so well-known

+00:14:38.080 --> 00:14:40.920
+and just, like, I can do the maths on the back of an envelope.

+00:14:40.980 --> 00:14:44.120
+I know exactly how to scale it up, having done it for so long.
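Carlton's back-of-the-envelope WSGI maths can be made concrete. One well-known starting heuristic, the one the Gunicorn docs suggest for sync workers, is workers = 2 × cores + 1, and since each sync worker serves one request at a time, throughput tops out near workers divided by average response time. A rough sketch (the 4-core box and 50 ms latency are purely example numbers):

```python
def preforked_workers(cores: int) -> int:
    # Gunicorn's suggested starting point for sync workers:
    # (2 x num_cores) + 1.
    return 2 * cores + 1

def rough_capacity(workers: int, avg_response_s: float) -> float:
    # Each sync worker handles one request at a time, so the
    # ceiling is roughly workers / average latency.
    return workers / avg_response_s

cores = 4  # example box; use os.cpu_count() on a real host
workers = preforked_workers(cores)
print(workers)                        # 9 workers
print(rough_capacity(workers, 0.05))  # ~180 requests/second at 50 ms/request
```

This is the kind of arithmetic that's hard to do for an async server, where concurrency per process depends on what each request awaits.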
+

00:14:44.720 --> 00:14:46.560
+For me, for my core application,

+00:14:46.860 --> 00:14:48.460
+I would still rather use the WSGI server

+00:14:48.500 --> 00:14:53.880
+and then limit the async stuff to just the use cases

+00:14:54.080 --> 00:14:55.360
+where it's particularly suited.

+00:14:55.840 --> 00:14:56.520
+So we'll do that.

+00:14:57.340 --> 00:14:59.700
+Process manager, I deploy using systemd.

+00:15:00.999 --> 00:15:04.020
+If I want a container, I'll use Podman via systemd.

+00:15:05.260 --> 00:15:06.620
+It's as old school as it gets.

+00:15:06.800 --> 00:15:10.440
+I'll very often run a Redis instance on localhost for caching,

+00:15:10.660 --> 00:15:12.520
+and that will be it.

+00:15:13.020 --> 00:15:14.300
+And that will get me an awful long way.

+00:15:15.639 --> 00:15:16.900
+Just get a bigger box.

+00:15:17.620 --> 00:15:18.480
+Yeah, yeah, yeah.

+00:15:19.040 --> 00:15:21.580
+If you really, really, really need multiple boxes,

+00:15:21.720 --> 00:15:22.420
+well, then we'll talk.

+00:15:22.740 --> 00:15:24.620
+I feel like you and I are in a similar vibe.

+00:15:25.040 --> 00:15:26.900
+But one thing I want to sort of throw out there to you,

+00:15:27.020 --> 00:15:28.520
+but also sort of the others is,

+00:15:29.400 --> 00:15:30.360
+what are we talking with databases?

+00:15:30.880 --> 00:15:33.140
+Like, who's bold enough to go SQLite?

+00:15:33.440 --> 00:15:34.740
+Anyone's going SQLite out there?

+00:15:36.200 --> 00:15:36.500
+Yeah.

+00:15:36.780 --> 00:15:37.540
+It depends, right?

+00:15:38.140 --> 00:15:39.560
+It just depends on what you're doing, right?

+00:15:39.560 --> 00:15:41.580
+And how many concurrent users you're going to have.

+00:15:42.080 --> 00:15:42.900
+It really depends on there.

+00:15:43.860 --> 00:15:49.720
+The Pallets website is running on Flask,

+00:15:50.040 --> 00:15:50.940
+which I wasn't doing for a while.
+

00:15:50.940 --> 00:15:52.280
+I was doing a static site generator.

+00:15:52.900 --> 00:15:57.100
+Then I got inspired by Andrew Godwin's static-dynamic sites.

+00:15:58.000 --> 00:16:03.460
+And so it loads up all these markdown files, static markdown files into a SQLite database

+00:16:04.880 --> 00:16:08.680
+at runtime and then serves off of that because you can query really fast.

+00:16:09.080 --> 00:16:09.820
+Oh, that's awesome.

+00:16:09.900 --> 00:16:10.300
+I love it.

+00:16:11.120 --> 00:16:11.760
+So yeah, that's cool.

+00:16:11.800 --> 00:16:13.080
+SQLite for the Pallets website.

+00:16:14.080 --> 00:16:14.560
+Awesome.

+00:16:14.680 --> 00:16:21.899
+Also, I do have a few small apps that use SQLite and one recently that's Cody's fault

+00:16:21.940 --> 00:16:23.480
+because he put me on that track

+00:16:24.520 --> 00:16:28.100
+where it's running a SQLite database in the browser

+00:16:28.300 --> 00:16:32.160
+because it's really, nowadays, it's quite easy to do that.

+00:16:33.100 --> 00:16:35.660
+And then you can do all sorts of stuff with it,

+00:16:35.860 --> 00:16:40.340
+like hook into it with DuckDB and perform some analysis.

+00:16:40.580 --> 00:16:44.280
+So you don't actually need to run any sort of server at all.

+00:16:44.320 --> 00:16:48.040
+You can just throw some files into Nginx and serve your data.

+00:16:49.240 --> 00:16:52.140
+As long as that's static, you have a super, super simple deployment.

+00:16:53.200 --> 00:16:55.740
+So yeah, definitely SQLite if you can.

+00:16:56.540 --> 00:16:56.900
+I like it.

+00:16:57.180 --> 00:16:58.220
+I agree.

+00:16:58.400 --> 00:16:58.760
+It's interesting.

+00:16:58.940 --> 00:17:01.000
+Like the database probably won't go down with that.

+00:17:01.780 --> 00:17:01.960
+Probably.

+00:17:02.879 --> 00:17:04.260
+Let's do this by framework.

+00:17:04.400 --> 00:17:06.579
+So we'll do vertical slices in the visual here.
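David's markdown-into-SQLite trick is only a few lines of code. A minimal sketch of the idea, loading static markdown files into an in-memory database at startup and querying from there (the directory layout and `pages` schema are made up for illustration; the real Pallets site will differ):

```python
import sqlite3
import tempfile
from pathlib import Path

def load_site(content_dir: Path) -> sqlite3.Connection:
    # Read every markdown file once at startup into an
    # in-memory SQLite database, then serve queries from it.
    db = sqlite3.connect(":memory:")
    db.execute("CREATE TABLE pages (slug TEXT PRIMARY KEY, body TEXT)")
    for md in sorted(content_dir.glob("*.md")):
        db.execute(
            "INSERT INTO pages VALUES (?, ?)",
            (md.stem, md.read_text(encoding="utf-8")),
        )
    db.commit()
    return db

# Tiny demo corpus standing in for the site's markdown tree.
tmp = Path(tempfile.mkdtemp())
(tmp / "about.md").write_text("# About\nHello.", encoding="utf-8")
(tmp / "index.md").write_text("# Home", encoding="utf-8")

db = load_site(tmp)
row = db.execute("SELECT body FROM pages WHERE slug = ?", ("about",)).fetchone()
print(row[0])  # prints the about page's markdown
```

Since the data is rebuilt from files on every start, there's nothing to migrate or back up, which is what makes the deployment so simple.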
+

00:17:06.839 --> 00:17:07.280
+So Jeff.

+00:17:09.400 --> 00:17:11.980
+Yeah, Django, Postgres, pretty old school stack.

+00:17:12.819 --> 00:17:15.540
+I think putting a CDN in front of anything is just a win.

+00:17:15.620 --> 00:17:18.800
+So whether you like Fastly or Cloudflare, you get a lot of mileage out of it.

+00:17:18.900 --> 00:17:22.380
+You learn a lot about caching because it's kind of hard to cache Django by default.

+00:17:22.680 --> 00:17:25.740
+So you get to play with curl and kind of figure out why Vary headers are there.

+00:17:26.000 --> 00:17:28.720
+And it's a good learning experience to get through that.

+00:17:29.320 --> 00:17:33.420
+I also like Coolify, which is kind of new, at least new to me and new to Michael.

+00:17:33.520 --> 00:17:35.060
+We talked about this in our spare time a lot.

+00:17:35.680 --> 00:17:39.000
+It's just kind of a boring service that will launch a bunch of containers for you.

+00:17:39.620 --> 00:17:41.080
+There's a bunch of one-click installs.

+00:17:41.200 --> 00:17:42.380
+So Postgres is a one-click.

+00:17:42.620 --> 00:17:45.580
+It also does backups for you, which is really nice to have for free.

+00:17:46.240 --> 00:17:49.520
+I run a couple dozen sites through it and really like it.

+00:17:49.850 --> 00:17:51.140
+You can either do the hosted form.

+00:17:51.270 --> 00:17:52.480
+I don't get any money from it,

+00:17:52.480 --> 00:17:54.160
+or you can run the open source version.

+00:17:54.540 --> 00:17:55.120
+I do both.

+00:17:55.290 --> 00:17:57.100
+I've got like a home lab that I just run stuff

+00:17:57.330 --> 00:17:58.480
+using the open source version.

+00:17:59.060 --> 00:18:01.180
+For five bucks a month, it's worth it to run a couple servers.

+00:18:01.990 --> 00:18:03.040
+Yeah, Coolify is great.

+00:18:03.040 --> 00:18:03.780
+You can just scale up.
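Jeff's point about Vary headers is worth unpacking for anyone who hasn't fought a CDN yet: the response's Vary header tells caches which request headers become part of the cache key, which is why a stray `Vary: Cookie` can quietly destroy your hit rate. A toy sketch of that keying logic (a simplified model for illustration, not any particular CDN's implementation):

```python
def cache_key(path: str, request_headers: dict[str, str], vary: str) -> tuple:
    # The cache key is the URL plus the value of every request
    # header named in the response's Vary header. The more
    # headers you vary on, the more distinct cache entries you
    # create and the fewer hits you get.
    names = [v.strip().lower() for v in vary.split(",") if v.strip()]
    return (path, tuple((n, request_headers.get(n, "")) for n in sorted(names)))

a = cache_key("/", {"accept-encoding": "gzip"}, "Accept-Encoding")
b = cache_key("/", {"accept-encoding": "br"}, "Accept-Encoding")
c = cache_key("/", {"accept-encoding": "gzip", "cookie": "session=x"},
              "Accept-Encoding")
print(a != b)  # True: gzip and brotli responses are cached separately
print(a == c)  # True: the cookie is ignored unless Vary names it
```

This is the behavior you can observe with `curl -I` against a CDN, as Jeff describes.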
+

00:18:05.920 --> 00:18:07.460
+Yeah, it's got a bunch of one-click deploys

+00:18:07.820 --> 00:18:10.120
+for self-hosted SaaS things as well.

+00:18:10.300 --> 00:18:13.460
+I want an analytics stack of containers

+00:18:13.680 --> 00:18:14.940
+that run in its own isolated bit.

+00:18:14.970 --> 00:18:16.080
+Just click here and go.

+00:18:16.120 --> 00:18:17.520
+So it's, yeah.

+00:18:17.820 --> 00:18:18.000
+Yeah.

+00:18:18.380 --> 00:18:20.160
+One click, it's installed and you're up.

+00:18:20.380 --> 00:18:26.160
+And then once you get one Django, Flask, FastAPI site working with it, and it uses like a Docker container.

+00:18:27.160 --> 00:18:34.200
+Once you get that set up, it's really easy to just kind of duplicate that site, plug it into GitHub or whatever your Git provider is.

+00:18:34.280 --> 00:18:40.000
+And it's a nice experience for what normally is just rsyncing files, and life's too short for that.

+00:18:41.140 --> 00:18:41.380
+Yep.

+00:18:41.980 --> 00:18:42.280
+All right.

+00:18:42.660 --> 00:18:45.180
+Sebastian, I want to have you go last on this one

+00:18:45.180 --> 00:18:48.200
+because I think you've got something pretty interesting

+00:18:48.220 --> 00:18:50.300
+with FastAPI Cloud to dive into.

+00:18:50.520 --> 00:18:52.140
+But let's do Litestar next.

+00:18:52.460 --> 00:18:52.560
+Cody.

+00:18:54.340 --> 00:18:58.260
+So I have actually bought all the way in on Granian.

+00:18:58.520 --> 00:19:02.060
+So for the ASGI server, I've actually been running Granian

+00:19:02.100 --> 00:19:03.900
+now for, I'd say, a year in production.

+00:19:04.519 --> 00:19:05.820
+It's worked pretty well.

+00:19:06.740 --> 00:19:09.660
+There's a couple of new things that I'm actually kind of experimenting with.

+00:19:09.700 --> 00:19:11.000
+I don't know how well they're going to work out.

+00:19:11.000 --> 00:19:12.160
+So I'm going to go ahead and throw this out there.
+

00:19:12.340 --> 00:19:15.220
+But Granian is one of the few ASGI servers

+00:19:15.380 --> 00:19:17.080
+that supports HTTP/2,

+00:19:17.780 --> 00:19:20.500
+and it actually can do HTTP/2 clear text.

+00:19:20.640 --> 00:19:22.820
+And so this is part of the next thing I'm gonna say.

+00:19:23.020 --> 00:19:23.900
+Because I work for Google,

+00:19:24.120 --> 00:19:27.660
+I'm actively using lots of Kubernetes and Cloud Run mainly.

+00:19:27.880 --> 00:19:29.580
+And so most of the things that I deploy

+00:19:29.780 --> 00:19:31.880
+are containerized on Cloud Run.

+00:19:32.020 --> 00:19:33.700
+And I typically would suggest

+00:19:34.360 --> 00:19:35.740
+if you're not using something like systemd

+00:19:35.900 --> 00:19:37.200
+and deploying it directly on bare metal,

+00:19:37.740 --> 00:19:39.859
+then you are gonna wanna let the container

+00:19:39.880 --> 00:19:42.140
+or whatever you're using to manage your processes

+00:19:43.100 --> 00:19:44.220
+manage that and spin that up.

+00:19:44.280 --> 00:19:47.060
+And so I typically try to allocate, you know,

+00:19:47.060 --> 00:19:48.600
+like one CPU for the container

+00:19:48.740 --> 00:19:51.480
+and let the actual framework scale it up and down as needed.

+00:19:52.740 --> 00:19:55.840
+Cloud Run itself has like an ingress,

+00:19:56.640 --> 00:19:57.980
+like a load balancer that sits in front

+00:19:58.180 --> 00:19:59.200
+that it automatically configures.

+00:19:59.240 --> 00:20:02.940
+And you're required to basically serve up clear text traffic

+00:20:03.260 --> 00:20:04.600
+when you run Cloud Run.

+00:20:05.640 --> 00:20:08.899
+And because now Granian supports HTTP/2

+00:20:08.920 --> 00:20:11.460
+and Cloud Run supports HTTP/2 clear text,

+00:20:11.510 --> 00:20:15.100
+you can now serve Granian as HTTP/2 traffic.

+00:20:15.350 --> 00:20:18.680
+The good thing about that is that you get an unlimited upload size.
+ +00:20:18.710 --> 00:20:22.380 +And so there are max thresholds to what you can upload into the various cloud environments. + +00:20:23.180 --> 00:20:26.820 +HTTP2 usually circumvents that or gets around it because of the way the protocol works. + +00:20:26.890 --> 00:20:29.920 +And so you get additional features and functionality because of that. + +00:20:30.060 --> 00:20:31.800 +So anyway, that's what I typically do. + +00:20:32.000 --> 00:20:38.660 +And most of my databases are usually Postgres, AlloyDB, if it needs to be something that's on the analytical side. + +00:20:38.860 --> 00:20:39.920 +So yeah, cool. + +00:20:40.390 --> 00:20:41.960 +Yeah, I'm on Team Granian as well. + +00:20:41.960 --> 00:20:43.420 +I think that's a super neat framework. + +00:20:43.600 --> 00:20:46.440 +I had Giovanni on who's behind it a while ago. + +00:20:47.420 --> 00:20:53.640 +And it seems like it's not as popular, + +00:20:54.060 --> 00:20:56.100 +but it's based on Hyper from the Rust world, which + +00:20:56.280 --> 00:21:00.040 +has like 130,000 projects based on it or something. + +00:21:00.160 --> 00:21:04.220 +So at its core, it's still pretty battle-tested. + +00:21:05.900 --> 00:21:06.980 +Yannick, how about you? + +00:21:08.020 --> 00:21:09.180 +You got a variety, it sounds like. + +00:21:09.720 --> 00:21:10.680 +Yeah, definitely. + +00:21:11.800 --> 00:21:15.640 +There's a pretty clear split between what I do at work + +00:21:15.940 --> 00:21:18.880 +and what I do outside of that. + +00:21:18.890 --> 00:21:22.260 +So at work, it's Kubernetes deployments, + +00:21:22.480 --> 00:21:26.220 +and we manage that pretty much the same way that Cody described. + +00:21:26.420 --> 00:21:31.180 +So it's one or two processors per pod max, + +00:21:31.520 --> 00:21:33.960 +so you can have Kubernetes scale it + +00:21:34.020 --> 00:21:36.160 +or even manually easily scale it up. 
+

00:21:36.280 --> 00:21:41.620
+You can just go into Kubernetes and say, okay, do me one to five more pods or whatever.

+00:21:42.180 --> 00:21:42.740
+And don't have to worry.

+00:21:42.960 --> 00:21:44.660
+You don't have to start calculating, whatever.

+00:21:45.280 --> 00:21:56.580
+Most of the stuff we run nowadays is with Uvicorn. Our Django deployment, up until I think three months ago or so, was running on Gunicorn.

+00:21:56.760 --> 00:21:57.860
+But we switched that actually.

+00:21:58.860 --> 00:22:00.900
+And it's been a really great experience.

+00:22:01.800 --> 00:22:06.920
+I think we tried that a year ago, and it didn't work out quite so well.

+00:22:06.980 --> 00:22:12.520
+There were some things that didn't work as expected or didn't perform great,

+00:22:12.800 --> 00:22:17.540
+or Django was throwing some errors, or Uvicorn was throwing some errors.

+00:22:17.640 --> 00:22:22.640
+And then, apparently, all of that got fixed, because now it runs without any issue.

+00:22:23.080 --> 00:22:23.420
+Great.

+00:22:24.580 --> 00:22:29.400
+Yeah, for people who don't know, the vibe used to be run Gunicorn,

+00:22:29.440 --> 00:22:32.400
+but then with Uvicorn workers, if you're doing async stuff.

+00:22:32.620 --> 00:22:35.560
+And then Uvicorn kind of stepped up its game

+00:22:35.600 --> 00:22:38.820
+and said, you can actually treat us as our own app server.

+00:22:39.140 --> 00:22:40.700
+We will manage lifecycle and stuff.

+00:22:40.800 --> 00:22:42.360
+And so that's the path you took, right?

+00:22:42.980 --> 00:22:43.600
+Yeah, exactly.

+00:22:43.840 --> 00:22:44.240
+Before that.

+00:22:44.580 --> 00:22:46.160
+Well, no, actually, before that, we didn't

+00:22:46.340 --> 00:22:48.900
+because our Django is fully synchronous.

+00:22:49.040 --> 00:22:50.880
+It doesn't do any async.

+00:22:51.080 --> 00:22:53.720
+So it was just bare-metal Gunicorn.
+

00:22:53.840 --> 00:22:55.880
+And it's still synchronous.

+00:22:56.180 --> 00:22:57.780
+We're just running it under Uvicorn.

+00:22:58.980 --> 00:23:02.940
+But interestingly, it's still quite a bit faster in a few cases.

+00:23:03.560 --> 00:23:07.980
+We tried that out and we load-tested it in a couple of scenarios

+00:23:08.130 --> 00:23:10.540
+and we found that it makes a lot of sense.

+00:23:10.930 --> 00:23:14.980
+But outside of that, I do have a lot of, well,

+00:23:15.720 --> 00:23:19.200
+very simplistic deployments that are also just systemd

+00:23:19.480 --> 00:23:24.479
+and a couple of Docker compose files and containers

+00:23:24.500 --> 00:23:31.260
+that are managed through some old, cobbled-together Ansible things.

+00:23:32.520 --> 00:23:36.300
+But I think the oldest one that I have still running is from 2017,

+00:23:37.180 --> 00:23:41.320
+and it's been running without a change for like four or five years.

+00:23:41.820 --> 00:23:42.520
+That is awesome.

+00:23:42.980 --> 00:23:46.900
+I don't see a reason to do anything about it because the app works.

+00:23:47.120 --> 00:23:48.720
+It's being used productively.

+00:23:50.840 --> 00:23:52.820
+So why change anything about that?

+00:23:52.940 --> 00:23:53.560
+No need to introduce churn.

+00:23:54.680 --> 00:23:55.580
+Just don't touch it.

+00:23:56.020 --> 00:23:59.940
+Yeah, I was actually looking into Coolify that you two guys mentioned.

+00:24:00.820 --> 00:24:03.500
+I was thinking about maybe upgrading it to that,

+00:24:03.680 --> 00:24:06.980
+but I played around with it and I thought, well, why?

+00:24:08.140 --> 00:24:12.720
+You know, if I have to look into that deployment maybe once a year,

+00:24:13.180 --> 00:24:17.960
+there's really nothing to gain for me in making it more complicated.

+00:24:18.920 --> 00:24:21.260
+Right. Awesome.

+00:24:22.580 --> 00:24:24.100
+David, Team Flask.
+

00:24:27.980 --> 00:24:29.660
+I was trying to pull up a bunch of links really quick.

+00:24:32.160 --> 00:24:34.680
+So I mentioned this before the show started,

+00:24:35.920 --> 00:24:39.700
+and I think I'm pretty sure I've said this the last time I was on Talk Python,

+00:24:40.420 --> 00:24:48.100
+but the projects I do for work typically have less than 100 users.

+00:24:48.980 --> 00:24:51.840
+And so my deployment is usually really simple.

+00:24:52.120 --> 00:24:55.120
+And usually they've chosen like Azure or AWS already.

+00:24:55.520 --> 00:25:03.720
+So we just have a Docker container and we put it on the relevant Docker container host and it just works for them.

+00:25:04.440 --> 00:25:07.120
+We have a Postgres database and we have like Redis.

+00:25:09.140 --> 00:25:15.740
+But I never really had to deal with like scaling or that sort of stuff.

+00:25:16.040 --> 00:25:23.860
+But the funny thing is, at least for my work, we're often replacing older systems.

+00:25:24.170 --> 00:25:37.880
+And so even a single Docker container running a Flask application is way more performant and responsive than anything they're used to from some 20-year-old or 30-year-old Java system.

+00:25:39.020 --> 00:25:44.720
+And it can just respond on a small container with a little bit of CPU and a little bit of memory.

+00:25:45.060 --> 00:25:47.940
+They're always shocked at like, how much do we need to pay for?

+00:25:48.100 --> 00:25:50.340
+Oh, just like it'll run on a potato.

+00:25:51.260 --> 00:25:52.420
+You know, there's only 100 users.

+00:25:52.620 --> 00:25:53.760
+And they're like, that's a lot of users.

+00:25:56.639 --> 00:26:01.860
+So my recommendation is always start small and then scale up from there.

+00:26:02.020 --> 00:26:04.680
+Don't try to overthink it ahead of time.

+00:26:07.419 --> 00:26:11.620
+But yeah, for my personal stuff, I'm using like Docker containers now and fly.io.
+
+00:26:11.760 --> 00:26:18.460
+I haven't gotten to it. I do want to look into Granian and Coolify, but I haven't gotten there yet.
+
+00:26:21.440 --> 00:26:26.480
+And for the Docker container, I can definitely recommend pythonspeed.com. I don't remember off
+
+00:26:26.500 --> 00:26:31.940
+the top of my head who writes that, but it's somebody in the Python ecosystem. And they have
+
+00:26:32.040 --> 00:26:36.480
+a whole series of articles on how to optimize your Docker container. And that sounds really
+
+00:26:36.640 --> 00:26:41.720
+complicated, but you end up with a Dockerfile that's like 20 lines long or something. It's not
+
+00:26:43.520 --> 00:26:44.480
+like there's crazy things there.
+
+00:26:44.590 --> 00:26:46.660
+It's just you have to know how to structure it.
+
+00:26:46.960 --> 00:26:48.620
+And then I just copy and paste that to the next project.
+
+00:26:49.020 --> 00:26:49.580
+Nice, yeah.
+
+00:26:50.030 --> 00:26:51.980
+I resisted doing Docker for a long time because I'm like,
+
+00:26:52.010 --> 00:26:53.100
+I don't want that extra complexity.
+
+00:26:53.310 --> 00:26:56.020
+But then I realized, like, the stuff you put in the Dockerfile
+
+00:26:56.200 --> 00:26:58.100
+is really what you just type in the terminal once,
+
+00:26:58.130 --> 00:26:59.520
+and then you forget it.
+
+00:26:59.600 --> 00:27:01.280
+Or type the Ansible, or, yeah, yeah.
+
+00:27:02.760 --> 00:27:07.380
+And so, like, I'm always using Postgres, Redis,
+
+00:27:07.680 --> 00:27:09.280
+probably if I need, like, some background tasks,
+
+00:27:09.620 --> 00:27:12.080
+just plain SMTP server for email.
+
+00:27:13.960 --> 00:27:17.120
+I wrote, for all three of those things,
+
+00:27:17.160 --> 00:27:19.340
+I wrote new extensions in the Flask ecosystem
+
+00:27:19.660 --> 00:27:21.640
+that I'm trying to get more people to know about now. 
+
+00:27:22.240 --> 00:27:25.680
+So Flask SQLAlchemy Lite, L-I-T-E,
+
+00:27:26.240 --> 00:27:27.460
+instead of Flask SQLAlchemy,
+
+00:27:28.560 --> 00:27:29.760
+it takes a much more lightweight approach
+
+00:27:30.000 --> 00:27:32.220
+to integrating SQLAlchemy with Flask.
+
+00:27:33.060 --> 00:27:33.660
+Yeah, awesome.
+
+00:27:34.540 --> 00:27:35.560
+And then Flask Redis,
+
+00:27:35.880 --> 00:27:39.580
+which I revived from like 10 years of non-maintenance.
+
+00:27:41.060 --> 00:27:49.780
+And then I wrote this whole system, this whole pluggable email system called Email Simplified, kind of inspired by Django's pluggable system.
+
+00:27:51.500 --> 00:27:55.080
+And so there's like Flask Email Simplified to integrate that with Flask.
+
+00:27:55.280 --> 00:28:01.120
+But unlike Django, you can use Email Simplified in any library you're writing, in any Python application you're writing.
+
+00:28:01.160 --> 00:28:02.800
+It doesn't have to be a Flask web app.
+
+00:28:03.620 --> 00:28:05.280
+It's pluggable as the library itself.
+
+00:28:06.020 --> 00:28:08.600
+And then you can also integrate it with Flask or something else.
+
+00:28:09.240 --> 00:28:09.360
+Cool.
+
+00:28:09.460 --> 00:28:11.120
+So Flask Email Simplified.
+
+00:28:12.480 --> 00:28:15.220
+I've just been doing email right now, so it needs some popularity.
+
+00:28:15.500 --> 00:28:15.900
+Awesome.
+
+00:28:16.120 --> 00:28:18.400
+I've been doing the non-simplified email lately,
+
+00:28:18.660 --> 00:28:20.960
+so I'm happy to hear that there might be a better way.
+
+00:28:21.580 --> 00:28:21.880
+Yeah.
+
+00:28:23.560 --> 00:28:26.760
+Yeah, I think people do underappreciate just how much
+
+00:28:27.080 --> 00:28:29.520
+performance you can get out of Python web apps.
+
+00:28:29.780 --> 00:28:32.080
+You know, they're like, oh, we're
+
+00:28:32.080 --> 00:28:34.940
+going to need to rewrite this in something else because of the GIL
+
+00:28:35.140 --> 00:28:35.620
+or whatever. 
+
+00:28:35.920 --> 00:28:41.640
+Like I decided, just to make a point, to pull up and tail my log running Quart,
+
+00:28:41.670 --> 00:28:41.940
+by the way.
+
+00:28:42.540 --> 00:28:46.520
+And each one of these requests is doing like multiple DB calls and it's like 23
+
+00:28:46.820 --> 00:28:50.960
+milliseconds, six milliseconds, three milliseconds, nine milliseconds.
+
+00:28:51.180 --> 00:28:54.860
+It's like, that's good enough.
+
+00:28:55.480 --> 00:28:59.080
+That's a lot of requests per second per worker until
+
+00:28:59.080 --> 00:28:59.880
+you've got to have a lot of traffic.
+
+00:29:01.840 --> 00:29:04.760
+Speaking of Quart, Phil, what's your take on this one?
+
+00:29:05.920 --> 00:29:06.800
+I think it's very similar.
+
+00:29:07.100 --> 00:29:08.960
+I also build Docker containers
+
+00:29:10.860 --> 00:29:12.960
+with a Postgres database on the back end.
+
+00:29:13.080 --> 00:29:15.840
+And I run Hypercorn as the ASGI server
+
+00:29:16.460 --> 00:29:18.740
+and put them behind an AWS load balancer
+
+00:29:19.180 --> 00:29:20.400
+and just run them in ECS.
+
+00:29:20.620 --> 00:29:22.480
+And I think it's pretty simple,
+
+00:29:22.760 --> 00:29:24.360
+but I guess it depends on your biases.
+
+00:29:24.640 --> 00:29:26.540
+But yeah, that's all we do, really.
+
+00:29:26.660 --> 00:29:27.460
+And it goes a long way.
+
+00:29:27.620 --> 00:29:29.720
+There are multiple ECS tasks,
+
+00:29:30.160 --> 00:29:32.400
+mostly in case one falls over rather than for scaling,
+
+00:29:32.600 --> 00:29:35.240
+because it's usually the database that you need to scale, I find. 
+ +00:29:36.000 --> 00:29:37.720 +yeah that's how we run it + +00:29:37.890 --> 00:29:39.760 +the nice thing for me about Hypercorn + +00:29:40.160 --> 00:29:41.720 +is that I can play with HTTP3 + +00:29:41.980 --> 00:29:43.360 +so that's what we're doing at times + +00:29:45.700 --> 00:29:46.440 +HTTP3 okay + +00:29:47.300 --> 00:29:49.460 +I've just been getting my HTTP2 game down + +00:29:49.640 --> 00:29:51.240 +so I'm already behind the game + +00:29:53.060 --> 00:29:54.300 +what's the deal with HTTP3? + +00:29:56.900 --> 00:29:57.760 +well I mean it's + +00:29:58.540 --> 00:29:59.880 +obviously a totally new way of doing it + +00:29:59.910 --> 00:30:01.820 +over UDP now rather than TCP + +00:30:03.040 --> 00:30:04.540 +although at the application level + +00:30:04.560 --> 00:30:05.920 +you can't tell any difference really. + +00:30:06.180 --> 00:30:08.520 +But I mean, I just find it interesting. + +00:30:08.720 --> 00:30:10.600 +I'm not really sure it will help too much. + +00:30:10.960 --> 00:30:13.000 +And it's probably best if you've got users + +00:30:13.090 --> 00:30:15.460 +who have not that great a network connection. + +00:30:16.140 --> 00:30:19.600 +But for most other cases, it doesn't matter too much. + +00:30:19.980 --> 00:30:24.600 +Just keep blasting packets until some of them get through. + +00:30:25.220 --> 00:30:25.860 +Okay, fine. + +00:30:25.860 --> 00:30:26.780 +We'll give you a page eventually. + +00:30:26.960 --> 00:30:27.760 +Here's three pages actually. + +00:30:29.700 --> 00:30:30.340 +All right, Sebastian. + +00:30:31.240 --> 00:30:34.800 +You are running not just FastAPI from your experience, + +00:30:35.080 --> 00:30:37.480 +but you're running FastAPI for a ton of people + +00:30:37.820 --> 00:30:41.240 +through FastAPI Cloud at, I'm sure, many different levels. + +00:30:42.740 --> 00:30:46.280 +Yeah, so this probably sounds like a shameless plug. + +00:30:46.280 --> 00:30:48.840 +And it kind of is, but it's sort of expected. 
+
+00:30:50.100 --> 00:30:52.140
+I will deploy FastAPI on FastAPI Cloud.
+
+00:30:52.700 --> 00:30:54.140
+Just because-- well, the idea is just
+
+00:30:54.140 --> 00:30:55.940
+to make it super simple to do that.
+
+00:30:56.320 --> 00:30:59.620
+If you are able to run the command fastapi run,
+
+00:31:00.040 --> 00:31:06.300
+fastapi run has the production server that is using Uvicorn underneath, and if you can run that,
+
+00:31:06.460 --> 00:31:11.880
+then you can also run fastapi deploy, and then, you know, it will most probably just work.
+
+00:31:12.600 --> 00:31:18.760
+And we just wrap everything: build, install, deploy, handle HTTPS, all the
+
+00:31:18.780 --> 00:31:23.740
+stuff, without needing any Dockerfile or anything like that. And I think for many use cases it's just
+
+00:31:23.840 --> 00:31:27.860
+simpler being able just to do that. There are so many projects that I have been now
+
+00:31:27.880 --> 00:31:33.560
+building, you know, random stuff that is not really important, but now I can. And before
+
+00:31:33.700 --> 00:31:38.000
+it was like, yeah, well, I know how to deploy this thing, like fully with all the bells
+
+00:31:38.020 --> 00:31:42.940
+and whistles, but it's just so much work that, yeah, maybe later. So for that, I will end up just
+
+00:31:42.980 --> 00:31:48.960
+going with that. Now, if I didn't... sorry, go ahead. Well, what I was going to ask is
+
+00:31:50.200 --> 00:31:54.380
+how much are you willing to tell us about how things run inside FastAPI Cloud? 
+
+00:31:55.660 --> 00:32:02.240
+Oh, it's just so much stuff, I'm sure. I mean, you know, it's also fun
+
+00:32:02.260 --> 00:32:06.180
+that, you know, nowadays, like, we had Docker and we had
+
+00:32:06.260 --> 00:32:10.740
+Docker Swarm, and there was Nomad and Kubernetes, and, oh, Kubernetes won. And then we
+
+00:32:10.820 --> 00:32:16.580
+have the cloud providers, and there's AWS and Google and Azure, and you would expect
+
+00:32:16.760 --> 00:32:22.920
+that with all these things and all this complexity, it's like, okay, these are the clear
+
+00:32:22.920 --> 00:32:28.700
+winners, so it's a lot of complexity to take on, but once you do it, it all works. But it
+
+00:32:28.820 --> 00:32:34.940
+doesn't, and it's just so much work to get things to work together, to
+
+00:32:35.080 --> 00:32:40.180
+work correctly. And the official resources from the different providers
+
+00:32:40.280 --> 00:32:45.040
+in many cases are like, oh, the solution is hidden in this issue somewhere on GitHub because
+
+00:32:45.160 --> 00:32:49.940
+the previous version was obsolete, but now the new version of this package or whatever, it's like,
+
+00:32:49.800 --> 00:32:57.460
+you know, it's just crazy. But yeah, so then, if I didn't have FastAPI Cloud,
+
+00:32:58.320 --> 00:33:02.780
+I would probably use containers, I would probably use Docker. If it's something simple, I would
+
+00:33:02.800 --> 00:33:08.060
+deploy with Docker Compose, probably try to set the minimum replicas. I don't remember if
+
+00:33:08.500 --> 00:33:13.380
+Docker Compose has that; I remember that Docker Swarm had that, but then Docker Swarm sort of lost
+
+00:33:13.400 --> 00:33:21.420
+against Kubernetes. I would put a Traefik load balancer in front to handle HTTPS, and, yeah,
+
+00:33:21.430 --> 00:33:27.660
+well, like regular load balancing 
and, yeah, just regular Uvicorn. What some of the
+
+00:33:27.660 --> 00:33:33.940
+folks were talking about before: at some point we needed to have Gunicorn on top of Uvicorn because
+
+00:33:34.100 --> 00:33:38.820
+Uvicorn wouldn't be able to handle workers, but now Uvicorn can handle its workers and everything.
+
+00:33:38.840 --> 00:33:43.780
+The main thing was the zombie processes and, you know, reaping
+
+00:33:44.240 --> 00:33:48.620
+the processes and handling that stuff. Now it takes care of that, so you can just run
+
+00:33:49.160 --> 00:33:54.680
+plain Uvicorn. So if you are using FastAPI and you say fastapi run, that already does that. So if
+
+00:33:54.740 --> 00:33:59.480
+you're deploying on your own, you can just use the fastapi run command. Then of course you have to
+
+00:33:59.520 --> 00:34:04.980
+deal with scaling and HTTPS and balancing, all that stuff, but the core
+
+00:34:05.880 --> 00:34:12.440
+server, you can just run it directly. If going beyond that, then it would probably be some
+
+00:34:12.879 --> 00:34:18.820
+Kubernetes cluster, trying to scale things, figuring out the ways to scale things based on
+
+00:34:19.120 --> 00:34:25.860
+the load of the requests, you know, scaling automatically, having normally one container
+
+00:34:26.220 --> 00:34:30.480
+per process to be able to scale that more dynamically without depending on the local memory
+
+00:34:30.700 --> 00:34:35.840
+for each one of the servers, and things like that. I'm probably saying too much. But yeah, actually,
+
+00:34:35.860 --> 00:34:48.000
+if I didn't have FastAPI Cloud, I would probably use one of the providers that abstract those things a little bit away, you know, like Render, Railway, Fly, like, I don't know. 
+
+00:34:48.280 --> 00:35:01.200
+I don't really think that a regular developer should be dealing with, you know, like the big hyperscalers and like Kubernetes and like all the complexity for a common app.
+
+00:35:01.560 --> 00:35:06.460
+In most of the cases, I think it's just really too much complexity to deal with.
+
+00:35:07.560 --> 00:35:12.020
+It's kind of eye-watering to open up the AWS console or Azure or something.
+
+00:35:12.400 --> 00:35:12.540
+Whoa.
+
+00:35:13.480 --> 00:35:19.820
+The other day, in one of the AWS accounts, I had to change the account email.
+
+00:35:20.320 --> 00:35:21.420
+I think I spent four hours.
+
+00:35:22.120 --> 00:35:22.600
+I know.
+
+00:35:23.660 --> 00:35:27.360
+Because I had to create a delegate account that has the right permissions.
+
+00:35:27.720 --> 00:35:33.020
+Often it's like, oh no, you know, sometimes it's just overwhelming, the amount
+
+00:35:33.020 --> 00:35:39.220
+of complexity that needs to be dealt with. And, yeah, I mean, it's great to really have,
+
+00:35:39.350 --> 00:35:45.760
+you know, the infra people that I have working with me at the company that can deal
+
+00:35:45.880 --> 00:35:50.380
+with all that mess and make sure that everything is just running perfectly, and
+
+00:35:51.200 --> 00:35:57.680
+it just works. So it's, you know, sort of SRE as a service, their work as a service.
+
+00:35:57.860 --> 00:36:01.560
+It's like a cloud that provides the whole site reliability engineering.
+
+00:36:02.000 --> 00:36:02.560
+Yeah, that's awesome.
+
+00:36:03.940 --> 00:36:07.100
+So I spent a number of years doing nothing but cloud migrations
+
+00:36:07.220 --> 00:36:09.680
+to these hyperscalers for like enterprises. 
+ +00:36:10.280 --> 00:36:13.820 +And I can tell you that when you mentioned the eye-watering comment + +00:36:14.080 --> 00:36:15.560 +about like the network and all that stuff, + +00:36:15.700 --> 00:36:17.940 +it's so incredibly complicated now, right? + +00:36:17.980 --> 00:36:20.920 +Like there's literally every kind of concept that you need to know + +00:36:21.060 --> 00:36:25.820 +to deploy these enterprises now and move them from on-prem to the cloud. + +00:36:25.840 --> 00:36:27.760 +So it does get incredibly complicated. + +00:36:27.910 --> 00:36:30.560 +Having something simple like what Sebastian is talking about, + +00:36:30.570 --> 00:36:31.940 +I think is, is, you know, + +00:36:32.270 --> 00:36:35.140 +super helpful when you're just trying to get started and get something up and + +00:36:35.240 --> 00:36:36.080 +running, you know, quickly. + +00:36:37.000 --> 00:36:39.540 +Yeah, for sure. All right. + +00:36:40.380 --> 00:36:43.560 +I've got a lot of questions and I realized that we will not be getting through + +00:36:43.630 --> 00:36:47.320 +all of them. So I want to, I want to pick carefully. + +00:36:47.540 --> 00:36:50.060 +So let's do, let's do this one next. + +00:36:52.039 --> 00:36:54.120 +Performance. What's your best + +00:36:55.260 --> 00:37:01.180 +low effort tip, not like something super complicated, but I know there's a bunch of + +00:37:01.190 --> 00:37:07.040 +low hanging fruit that people maybe missed out on. And this time, let's start with Litestar. + +00:37:07.510 --> 00:37:12.520 +Cody, back at you. All right. Well, so I'm going to stick to what I know, which is databases, + +00:37:12.730 --> 00:37:17.580 +because I deal with that every single day. There's a couple of things that I see as like + +00:37:17.600 --> 00:37:24.960 +gotchas that I constantly see over and over. 
One, SQLAlchemy at times kind of obfuscates
+
+00:37:25.900 --> 00:37:30.120
+the way it's going to execute things and what kind of queries it's going to actually execute. So
+
+00:37:30.440 --> 00:37:37.060
+it's really easy, if you're not kind of fluent in how it works, to create N plus one types of issues.
+
+00:37:37.160 --> 00:37:41.860
+And so when people start talking about sync or async, it's really, in my mind, it's less of that,
+
+00:37:41.920 --> 00:37:45.860
+because you're going to spend more time waiting on the network and database and those kinds of
+
+00:37:45.800 --> 00:37:51.240
+things than you're going to spend serializing just generally, right? And/or processing things
+
+00:37:51.260 --> 00:37:58.260
+on the web framework. And so one, making sure that you have your relationships dialed in correctly
+
+00:37:58.340 --> 00:38:07.420
+so that you don't have N plus one queries. The other thing is oversized connection pooling into
+
+00:38:07.580 --> 00:38:13.140
+Postgres and just databases in general, because what people don't tend to know is that each of
+
+00:38:13.000 --> 00:38:17.740
+those connections takes up CPU cycles and RAM of the database. And so when you slam the database
+
+00:38:17.960 --> 00:38:23.180
+with hundreds of connections, you're just taking away processing power that could be used for other
+
+00:38:23.340 --> 00:38:27.860
+things, right? And so you end up ultimately slowing things down. So I've seen databases that
+
+00:38:28.240 --> 00:38:32.620
+have had so many connections that all of the CPU and all the stuff is actually doing things,
+
+00:38:32.740 --> 00:38:35.020
+just managing connections, and can't actually do any database work.
+
+00:38:35.160 --> 00:38:39.440
+What about this socket? Is it busy? What about this socket? Is it busy? It's just round-robining
+
+00:38:39.460 --> 00:38:46.720
+that, right? Exactly. 
So yeah, paying attention to the database is kind of my first rule
+
+00:38:46.740 --> 00:38:53.080
+of thumb. 100%. I like that one a lot. Yannick. Yeah, I'll throw in identifying
+
+00:38:54.320 --> 00:38:58.540
+work that doesn't need to be done immediately for the user and putting it in a background task.
+
+00:38:59.340 --> 00:39:05.780
+Having a background worker defer things till later. So sending email is an example,
+
+00:39:05.840 --> 00:39:11.680
+although there are nuances there about knowing that it's sent and everything. But, yeah, if
+
+00:39:11.820 --> 00:39:18.360
+a user kicks off some process and then you do that process in the web worker, you're
+
+00:39:18.460 --> 00:39:24.940
+holding that worker up, which is more relevant in WSGI than ASGI. And you're making them wait
+
+00:39:25.060 --> 00:39:31.980
+for their page to load again, versus recording what they wanted to do, sending it off to the background,
+
+00:39:32.000 --> 00:39:35.520
+letting them see the status of it, but letting the background worker handle it.
+
+00:39:36.060 --> 00:39:37.080
+Yeah, that's great advice.
+
+00:39:39.040 --> 00:39:40.660
+All right. Yannick, as I said, you guys, go for it.
+
+00:39:40.760 --> 00:39:40.960
+Yeah, yeah.
+
+00:39:43.540 --> 00:39:50.780
+Well, it's not really a trick or a tip,
+
+00:39:50.860 --> 00:39:58.240
+it's more like the most common mistake I see. This is ASGI-specific,
+
+00:39:59.060 --> 00:40:01.240
+but when I look at ASGI apps that people have written,
+
+00:40:01.770 --> 00:40:06.200
+by folks who are maybe not as familiar with ASGI or async Python at all,
+
+00:40:07.120 --> 00:40:13.400
+it's if you make something an async function,
+
+00:40:13.560 --> 00:40:16.540
+you should be absolutely sure that it's non-blocking. 
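Yannick's warning is easy to demonstrate with the standard library alone: one blocking call inside an async def freezes the event loop for everyone, while offloading it to a thread keeps requests overlapped. A minimal sketch (the handler names and the 0.05-second "work" are illustrative, not from the episode):

```python
import asyncio
import time

async def bad_handler():
    time.sleep(0.05)  # blocking call: freezes the whole event loop
    return "done"

async def good_handler():
    await asyncio.to_thread(time.sleep, 0.05)  # offloaded to a worker thread
    return "done"

async def timed(handler) -> float:
    # Simulate five concurrent "requests" and time the whole batch.
    start = time.perf_counter()
    await asyncio.gather(*(handler() for _ in range(5)))
    return time.perf_counter() - start

blocking = asyncio.run(timed(bad_handler))   # ~0.25s: the sleeps run one after another
threaded = asyncio.run(timed(good_handler))  # much faster: the sleeps overlap
print(f"blocking: {blocking:.2f}s, threaded: {threaded:.2f}s")
```

The same thread offloading is what the ASGI frameworks arrange for you when you declare a plain `def` endpoint instead of an `async def` one, as discussed below.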
+
+00:40:17.519 --> 00:40:23.140
+Because if you're running an ASGI app and you're blocking anywhere,
+
+00:40:23.580 --> 00:40:25.920
+your whole application server is blocked completely.
+
+00:40:25.990 --> 00:40:28.420
+It doesn't handle any other requests at the same time.
+
+00:40:28.440 --> 00:40:29.580
+It's blocked fully.
+
+00:40:30.480 --> 00:40:39.500
+And I don't think I've seen any mistake more times
+
+00:40:39.790 --> 00:40:44.140
+when I've looked through some apps that someone has written
+
+00:40:44.290 --> 00:40:47.040
+or that I've come across somewhere.
+
+00:40:47.890 --> 00:40:50.460
+So this is really, it's super, super common.
+
+00:40:50.530 --> 00:40:55.160
+And it has such a big impact on the overall performance
+
+00:40:56.380 --> 00:40:59.660
+in every metric imaginable.
+
+00:41:00.200 --> 00:41:01.000
+So I would say,
+
+00:41:02.320 --> 00:41:05.180
+and that's nowadays what I tell people,
+
+00:41:06.440 --> 00:41:08.960
+unless you're 100% sure that you know what you're doing
+
+00:41:09.140 --> 00:41:11.460
+and you know it's non-blocking,
+
+00:41:12.140 --> 00:41:13.180
+don't make it async.
+
+00:41:13.740 --> 00:41:16.640
+Put it in a thread, execute it in a thread, whatever.
+
+00:41:17.260 --> 00:41:19.540
+All of the ASGI frameworks and Django
+
+00:41:21.560 --> 00:41:23.500
+give you a lot of tools at hand
+
+00:41:23.520 --> 00:41:30.440
+to translate your stuff from sync to async so you can still run it. Do that unless you're
+
+00:41:30.660 --> 00:41:34.380
+very sure that it actually fully supports async.
+
+00:41:35.300 --> 00:41:37.920
+Yeah, that's good advice. Sebastian.
+
+00:41:41.020 --> 00:41:48.620
+Hey, I'm actually going to second Yannick. 
I think, yeah, it's maybe counterintuitive
+
+00:41:48.640 --> 00:41:54.560
+that one of the tips for performance is to try to not optimize that much for performance at the
+
+00:41:54.640 --> 00:42:00.000
+beginning. You know, the idea with async is like, oh, you can get so much performance and
+
+00:42:00.100 --> 00:42:06.740
+throughput in terms of concurrency or whatever, but the thing is, in most of the cases,
+
+00:42:07.160 --> 00:42:12.640
+until apps grow so large, they actually don't need that much extra throughput, that much extra
+
+00:42:12.560 --> 00:42:19.080
+performance. And in a framework, like, as Yannick was saying, well, in my case I know Fast
+
+00:42:19.120 --> 00:42:24.020
+API, but also many others, if you define the function with async, it's going to be
+
+00:42:24.180 --> 00:42:29.140
+run async; if you define it non-async, a regular def, it's going to be run on a thread worker
+
+00:42:29.380 --> 00:42:36.140
+automatically. So it's just going to do the smart thing automatically. So it's fair, you know,
+
+00:42:36.140 --> 00:42:41.740
+it's going to be good enough, and then you can just start with that and just keep blocking
+
+00:42:41.760 --> 00:42:46.620
+code everywhere, you know, just not use async until you actually know that you really need to
+
+00:42:46.720 --> 00:42:53.980
+use async. And once you do, you have to be, you know, 100% sure that you are not running blocking
+
+00:42:54.160 --> 00:43:00.160
+code inside of it. And if you need to run blocking code inside of async code, then make sure that you
+
+00:43:00.220 --> 00:43:07.580
+are sending it through to a thread worker. Sending it through to a thread worker sounds like a daunting thing,
+
+00:43:07.600 --> 00:43:13.680
+but, yeah, Django has tools, AnyIO has tools. I also built something on top of
+
+00:43:13.700 --> 00:43:19.020
+AnyIO, called Asyncer, that is just to simplify these things, to asyncify a 
blocking function,
+
+00:43:19.740 --> 00:43:23.820
+keeping all the type information, so you get autocompletion and inline errors and everything,
+
+00:43:24.420 --> 00:43:29.560
+even though it's actually doing all the stuff of sending the thing to the thread worker. So
+
+00:43:29.760 --> 00:43:33.900
+the goal is super simple: you keep very simple code, but then underneath it's just doing
+
+00:43:33.940 --> 00:43:38.120
+all the stuff that it should be doing. But, you know, that's for when you actually need to
+
+00:43:38.740 --> 00:43:44.340
+hyper-optimize things. In most of the cases you can just start with not using async at first.
+
+00:43:45.140 --> 00:43:49.660
+Also, now that you're going to have Python multi-threaded, then suddenly you're going to have
+
+00:43:49.920 --> 00:43:56.240
+just so much more performance out of the blue without even having to do much more. So yeah,
+
+00:43:56.440 --> 00:44:02.380
+actually, sorry, I keep speaking so much, but here's a tip for
+
+00:44:02.400 --> 00:44:09.280
+improving performance: upgrade your Python version. I was just chatting today with Savannah.
+
+00:44:11.240 --> 00:44:16.940
+She was adding the benchmarks to, you know, the official
+
+00:44:17.120 --> 00:44:23.480
+Python benchmarks that they run for the Faster CPython program, and
+
+00:44:24.800 --> 00:44:34.980
+the change from Python 3.10 to Python 3.14 when running FastAPI is like almost double the
+
+00:44:35.280 --> 00:44:40.220
+performance or something like that. It was like, it was crazy. It was just crazy improvement in
+
+00:44:40.340 --> 00:44:45.260
+performance. So you can just upgrade your Python version, you're going to get so much better
+
+00:44:45.440 --> 00:44:51.520
+performance. That's an awesome piece of advice that I think is often overlooked. 
And it's not
+
+00:44:51.440 --> 00:44:54.260
+only CPU speed; memory usage also gets a lot lower.
+
+00:44:54.800 --> 00:44:55.580
+Whoever's going to jump in, go ahead.
+
+00:44:56.260 --> 00:44:56.600
+Oh, yeah.
+
+00:44:57.400 --> 00:45:02.660
+Last year, I was looking at MarkupSafe, which is an HTML escaping library that we use.
+
+00:45:03.580 --> 00:45:05.500
+And it has a C extension for speedups.
+
+00:45:05.560 --> 00:45:12.580
+And I almost convinced myself that I could stop maintaining the C extension because just Python itself got way faster.
+
+00:45:12.840 --> 00:45:16.820
+But then it turned out that I could do something to the C extension to make it faster also.
+
+00:45:18.220 --> 00:45:27.800
+But just the fact that I almost convinced myself, like, oh, I can drop a C extension for just a Python upgrade instead was pretty impressive.
+
+00:45:28.380 --> 00:45:29.180
+Yeah, that's really cool.
+
+00:45:30.140 --> 00:45:34.520
+Especially with, like, string handling, you know, which you're going to use for templating for web apps.
+
+00:45:37.240 --> 00:45:37.400
+Phil?
+
+00:45:38.900 --> 00:45:45.760
+Yeah, well, I definitely echo looking at your DB queries, because by and large, that's always where our performance issues have been.
+
+00:45:45.940 --> 00:45:49.720
+It's either a badly written query or we're returning most of the database when the user just wants
+
+00:45:49.720 --> 00:45:54.580
+to know about one thing or something silly like that. But I was thinking about low-hanging ones,
+
+00:45:54.670 --> 00:46:00.440
+which I think you asked about. So I'd say uvloop, which is still a noticeable improvement.
+
+00:46:01.380 --> 00:46:06.320
+And also because I think it's likely a lot of us are returning JSON, often changing the
+
+00:46:06.820 --> 00:46:11.780
+JSON serializer to one of the faster ones can be noticeable as well and obviously quite easy to do.
+
+00:46:12.040 --> 00:46:14.980
+So yeah, that's really good advice. 
+
+00:46:15.080 --> 00:46:16.800
+I didn't think about the JSON serializer.
+
+00:46:17.460 --> 00:46:18.100
+Which one do you recommend?
+
+00:46:19.440 --> 00:46:22.220
+I think, is it ujson, or is it orjson?
+
+00:46:22.780 --> 00:46:23.840
+I can't remember which one was different.
+
+00:46:23.980 --> 00:46:24.240
+Yeah, yeah.
+
+00:46:24.520 --> 00:46:25.080
+Replace it.
+
+00:46:27.000 --> 00:46:30.160
+But yeah, if you look at the TechEmpower benchmarks,
+
+00:46:30.480 --> 00:46:32.420
+everyone's changing their JSON serializer
+
+00:46:32.600 --> 00:46:33.760
+to get that bit of extra speed.
+
+00:46:33.940 --> 00:46:34.280
+It's--
+
+00:46:34.620 --> 00:46:35.820
+Yeah, you're like, our framework looks
+
+00:46:36.100 --> 00:46:38.280
+bad because our JSON serializer is like a third
+
+00:46:38.300 --> 00:46:38.920
+of the performance.
+
+00:46:39.480 --> 00:46:39.660
+Yeah.
+
+00:46:41.540 --> 00:46:44.900
+We changed, well, David added a JSON provider to Flask,
+
+00:46:45.100 --> 00:46:47.160
+and yeah, you could see it make a difference
+
+00:46:47.270 --> 00:46:49.200
+in the TechEmpower benchmarks, so that was really good.
+
+00:46:49.200 --> 00:46:49.620
+Oh, interesting.
+
+00:46:50.540 --> 00:46:50.840
+Yeah, cool.
+
+00:46:51.800 --> 00:46:52.880
+Yeah, it's pluggable now,
+
+00:46:53.100 --> 00:46:56.300
+but if you're not installing orjson,
+
+00:46:56.640 --> 00:46:59.540
+I mean, I don't know what other JSON library
+
+00:46:59.740 --> 00:47:00.580
+you'd be using at this point,
+
+00:47:00.590 --> 00:47:01.540
+unless you're already using one.
+
+00:47:01.880 --> 00:47:04.180
+But orjson is very, very fast.
+
+00:47:04.860 --> 00:47:06.520
+Okay, this is something I'm going to be looking into later. 
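Swapping the serializer really is usually just a few lines. A hedged sketch of the drop-in approach Phil and David describe, with a stdlib fallback since orjson may not be installed (the `dumps` wrapper is illustrative, not any framework's actual hook):

```python
import json

try:
    import orjson  # optional: Rust-backed, typically several times faster

    def dumps(obj) -> str:
        # orjson returns bytes, so decode to match json.dumps's interface.
        return orjson.dumps(obj).decode()
except ImportError:
    def dumps(obj) -> str:
        return json.dumps(obj, separators=(",", ":"))

payload = {"user": "ada", "scores": [1, 2, 3]}
encoded = dumps(payload)
print(encoded)
```

Either branch round-trips through `json.loads`, which is what makes a faster serializer safe to plug into a framework's JSON provider hook.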
+
+00:47:08.699 --> 00:47:10.380
+So over to Django, Jeff.
+
+00:47:10.640 --> 00:47:14.680
+David talked about running stuff in the background and I said, was it Django 5 or
+
+00:47:14.840 --> 00:47:16.540
+Django 6 that got the background task thing?
+
+00:47:17.300 --> 00:47:17.420
+Yeah.
+
+00:47:17.480 --> 00:47:17.940
+Django 6,
+
+00:47:18.010 --> 00:47:21.640
+which just came out a couple of weeks ago, and I'll hand that off to Carlton in
+
+00:47:21.640 --> 00:47:21.980
+a second,
+
+00:47:22.160 --> 00:47:22.600
+because I think
+
+00:47:23.420 --> 00:47:26.020
+Carlton had more to do with the actual plumbing,
+
+00:47:26.660 --> 00:47:27.580
+being on the steering council.
+
+00:47:28.580 --> 00:47:31.600
+My advice to people is the best way to scale something is just to not do it,
+
+00:47:31.660 --> 00:47:32.820
+avoid the process completely.
+
+00:47:32.990 --> 00:47:34.440
+So like I mentioned the CDN earlier,
+
+00:47:34.800 --> 00:47:35.720
+for content-heavy sites,
+
+00:47:35.900 --> 00:47:36.960
+cache the crap out of stuff.
+
+00:47:37.640 --> 00:47:38.660
+It doesn't even have to hit your servers.
+
+00:47:39.510 --> 00:47:40.280
+You can gain a lot,
+
+00:47:40.340 --> 00:47:41.220
+as we mentioned earlier too,
+
+00:47:41.380 --> 00:47:43.620
+just by doubling the amount of resources a project has.
+
+00:47:44.540 --> 00:47:45.880
+Django is pretty efficient these days,
+
+00:47:46.160 --> 00:47:47.040
+especially with async views.
+
+00:47:48.300 --> 00:47:49.240
+Like everybody else has said too,
+
+00:47:50.180 --> 00:47:52.780
+any blocking code, move off to threads,
+
+00:47:53.280 --> 00:47:54.800
+move off to a background queue. 
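The defer-it-to-a-queue pattern that Jeff and Yannick both recommend boils down to "enqueue, respond, let a worker drain it." A stdlib-only sketch of the idea (the `signup`/email names are hypothetical; real apps reach for Celery, Django Q2, or Django's new task framework so jobs survive restarts):

```python
import queue
import threading

jobs: queue.Queue = queue.Queue()
sent = []  # stands in for "the email actually went out"

def worker() -> None:
    # Runs in the background, draining jobs as they arrive.
    while True:
        user = jobs.get()
        sent.append(f"emailed {user}")  # pretend this is the slow SMTP call
        jobs.task_done()

threading.Thread(target=worker, daemon=True).start()

def signup(user: str) -> str:
    jobs.put(user)            # defer the slow part
    return f"{user} created"  # respond to the user immediately

response = signup("ada")
jobs.join()  # only so this demo can observe the result; a real handler never waits
print(response, sent)
```

The request handler's cost is one `put`; everything slow happens off the request path, which is exactly why the user's page loads immediately.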
+ +00:47:55.600 --> 00:47:57.860 +Django Q2 is my favorite one to use + +00:47:57.880 --> 00:47:58.900 +because you can use a database. + +00:47:59.640 --> 00:48:00.880 +So for those little side projects + +00:48:00.960 --> 00:48:02.680 +where you just want to run one or two processes, + +00:48:02.860 --> 00:48:03.420 +you can use it. + +00:48:03.420 --> 00:48:03.900 +It works great. + +00:48:05.520 --> 00:48:08.560 +And Carlton, if you want to talk about Django internals. + +00:48:09.300 --> 00:48:10.020 +Yeah, okay. + +00:48:10.200 --> 00:48:12.260 +So the new task framework I just mentioned, + +00:48:12.850 --> 00:48:15.920 +the main sort of bit about it is that it's, again, + +00:48:16.070 --> 00:48:17.400 +this pluggable Django API. + +00:48:17.610 --> 00:48:19.360 +So it gives a standard task API. + +00:48:19.470 --> 00:48:20.920 +So if you're writing a third-party library, + +00:48:21.190 --> 00:48:23.700 +and you need to send an email in, + +00:48:23.880 --> 00:48:25.100 +it's the canonical example, right? + +00:48:25.140 --> 00:48:27.160 +You need to send an email in your third-party library + +00:48:27.400 --> 00:48:28.840 +before you'd have had to tie yourself + +00:48:29.010 --> 00:48:30.340 +to a specific queue implementation, + +00:48:30.990 --> 00:48:32.600 +whereas now Django's providing a kind of, + +00:48:33.480 --> 00:48:34.920 +like an ORM of tasks. + +00:48:35.000 --> 00:48:35.460 +Right, right. + +00:48:35.460 --> 00:48:37.060 +You got to do Redis, you got to do Celery, + +00:48:37.290 --> 00:48:38.960 +and you got to manage things and all that. + +00:48:39.020 --> 00:48:41.960 +You don't have to pick that now as the third-party package author. + +00:48:42.040 --> 00:48:45.780 +You can just say, right, just wrap this as a Django task and queue it. 
+
+00:48:46.080 --> 00:48:49.920
+And then the developer, when they come to choose their backend,
+
+00:48:50.040 --> 00:48:52.080
+if they want to use Celery or they want to use Django Q2
+
+00:48:52.200 --> 00:48:53.860
+or they want to use the Django task backend,
+
+00:48:54.080 --> 00:48:58.420
+which Jake Howard, who wrote this for Django, provided as well,
+
+00:48:58.640 --> 00:49:00.020
+you can just plug that in.
+
+00:49:00.280 --> 00:49:02.780
+So it's a pluggable interface for tasks,
+
+00:49:03.020 --> 00:49:05.860
+which is, I think, the really nice thing about it.
+
+00:49:06.280 --> 00:49:09.320
+In terms of quick wins, everybody's mentioned almost all of mine.
+
+00:49:10.120 --> 00:49:12.240
+Cody and Phil, they mentioned the database.
+
+00:49:12.740 --> 00:49:13.420
+That's the big one.
+
+00:49:13.620 --> 00:49:17.100
+Django, the ORM, because it does lazy related lookups,
+
+00:49:17.240 --> 00:49:22.080
+it's very easy to trigger N+1 queries, where a book has multiple authors
+
+00:49:22.300 --> 00:49:23.940
+and suddenly you're iterating through the books
+
+00:49:24.080 --> 00:49:26.040
+and you're iterating through the authors and each one is a lookup.
+
+00:49:26.560 --> 00:49:29.280
+So you need to do things like prefetch_related, select_related.
+
+00:49:29.380 --> 00:49:30.980
+You need to just check that you've got those.
+
+00:49:31.180 --> 00:49:33.840
+Django Debug Toolbar is a great thing to run in development
+
+00:49:34.000 --> 00:49:35.680
+where you can see the queries and it'll tell you
+
+00:49:35.680 --> 00:49:36.660
+where you've got the duplicates.
+
+00:49:37.280 --> 00:49:39.980
+Then the slightly bigger one is to just check your indexes.
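Carlton's N+1 point can be made concrete with raw sqlite3 as a stand-in for the ORM — the schema here is invented for illustration, and counting queries shows what a `select_related`-style JOIN saves:

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.executescript("""
    CREATE TABLE author (id INTEGER PRIMARY KEY, name TEXT);
    CREATE TABLE book (id INTEGER PRIMARY KEY, title TEXT, author_id INTEGER);
    INSERT INTO author VALUES (1, 'Ada'), (2, 'Grace');
    INSERT INTO book VALUES (1, 'B1', 1), (2, 'B2', 2), (3, 'B3', 1);
""")

queries = 0
def run(sql, *params):
    global queries
    queries += 1
    return db.execute(sql, params).fetchall()

# Naive loop: one query for the books, then one more per book for its
# author -- the classic N+1 pattern lazy related lookups can hide.
for book_id, title, author_id in run("SELECT id, title, author_id FROM book"):
    run("SELECT name FROM author WHERE id = ?", author_id)
n_plus_one = queries  # 1 + 3 books = 4 queries

# A JOIN (what select_related generates) fetches everything in one query.
queries = 0
rows = run(
    "SELECT book.title, author.name FROM book "
    "JOIN author ON author.id = book.author_id"
)
```

With three books the naive loop already costs four queries to the JOIN's one; with three thousand rows the gap is what Django Debug Toolbar flags as duplicates.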
+
+00:49:40.300 --> 00:49:42.480
+The ORM will create the right indexes
+
+00:49:42.500 --> 00:49:45.460
+if you're going through primary keys or unique fields,
+
+00:49:45.500 --> 00:49:48.180
+but sometimes you're doing a filter on some field,
+
+00:49:48.740 --> 00:49:50.400
+and then there's not the right index there,
+
+00:49:50.400 --> 00:49:51.480
+and that can really slow you down.
+
+00:49:51.660 --> 00:49:56.000
+Again, you can do the SQL explain on that and find it.
+
+00:49:56.440 --> 00:49:59.360
+Then the thing I was going to say originally was caching.
+
+00:50:00.540 --> 00:50:04.080
+Get a Redis instance, stick it next to your Django app,
+
+00:50:04.180 --> 00:50:07.020
+and as Jeff said, don't do the work.
+
+00:50:07.020 --> 00:50:09.540
+If you're continually rendering the same page
+
+00:50:09.580 --> 00:50:10.380
+and it never changes,
+
+00:50:11.080 --> 00:50:12.740
+cache it and pull it from the cache
+
+00:50:12.910 --> 00:50:13.420
+rather than rendering.
+
+00:50:14.520 --> 00:50:16.000
+Because DB queries are one of your biggest things.
+
+00:50:16.080 --> 00:50:18.040
+The second one's always going to be serialization.
+
+00:50:18.160 --> 00:50:19.860
+That's either serialization or template rendering.
+
+00:50:20.220 --> 00:50:22.460
+So if you can avoid that by caching,
+
+00:50:22.800 --> 00:50:24.660
+you can save an awful lot of time on your requests.
+
+00:50:24.960 --> 00:50:25.140
+Yeah.
+
+00:50:25.590 --> 00:50:26.940
+I was wondering if somebody would come back
+
+00:50:26.970 --> 00:50:28.020
+with database indexes
+
+00:50:28.120 --> 00:50:32.760
+because that's like a 100x multiplier for free almost.
+
+00:50:32.920 --> 00:50:34.920
+It's such a big deal.
+
+00:50:35.200 --> 00:50:36.040
+It really can be.
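The "100x for free" effect of an index is easy to see with SQLite's EXPLAIN QUERY PLAN, which shows whether the engine scans the whole table or searches an index. A small sketch (the table and index names are made up):

```python
import sqlite3

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE book (id INTEGER PRIMARY KEY, isbn TEXT, title TEXT)")
db.executemany(
    "INSERT INTO book VALUES (?, ?, ?)",
    [(1, "978-0", "A"), (2, "978-1", "B"), (3, "978-2", "C")],
)

def plan(sql):
    # The last column of each EXPLAIN QUERY PLAN row is the human-readable detail.
    return " ".join(row[-1] for row in db.execute("EXPLAIN QUERY PLAN " + sql))

query = "SELECT title FROM book WHERE isbn = '978-0'"
before = plan(query)  # full table scan: every row is examined

db.execute("CREATE INDEX idx_book_isbn ON book (isbn)")
after = plan(query)   # now a search using the index instead
```

On a three-row table both are instant; on millions of rows, the scan is the full-table pass Michael describes and the index lookup is effectively constant time.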
+
+00:50:36.100 --> 00:50:37.380
+If you're making a particular query
+
+00:50:37.460 --> 00:50:38.600
+and it's doing a full table scan
+
+00:50:38.720 --> 00:50:40.180
+and all of a sudden you put the index in,
+
+00:50:40.280 --> 00:50:40.820
+it's instant.
+
+00:50:41.160 --> 00:50:41.700
+It's like, oh, wow.
+
+00:50:42.360 --> 00:50:44.200
+And you don't have to be a DBA
+
+00:50:44.420 --> 00:50:47.460
+or master information architect sort of thing.
+
+00:50:47.880 --> 00:50:48.880
+I don't know about Postgres.
+
+00:50:49.000 --> 00:50:49.660
+I'm sure it has it.
+
+00:50:49.720 --> 00:50:50.420
+Somebody can tell me.
+
+00:50:50.600 --> 00:50:53.380
+But with Mongo, you can turn on in the database,
+
+00:50:54.140 --> 00:50:55.780
+I want you to log all slow queries,
+
+00:50:56.040 --> 00:50:58.820
+and slow for me means 20 milliseconds or whatever.
+
+00:50:58.940 --> 00:50:59.880
+Like you put a number in
+
+00:51:00.380 --> 00:51:02.220
+and then you run your app for a while
+
+00:51:02.160 --> 00:51:04.660
+and you go look at what's slow, sorted by slowest.
+
+00:51:04.820 --> 00:51:06.620
+And then you can see, well, maybe that needs an index, right?
+
+00:51:06.840 --> 00:51:09.060
+Like just let your app tell you what you got to do.
+
+00:51:10.240 --> 00:51:11.660
+Yeah, there is a post.
+
+00:51:11.860 --> 00:51:13.560
+I'm just trying to see if I can quickly look it up now.
+
+00:51:13.700 --> 00:51:14.800
+There's a Postgres extension
+
+00:51:15.200 --> 00:51:18.360
+which will automatically run explain on the slow queries
+
+00:51:18.640 --> 00:51:19.360
+and log them for you.
+
+00:51:19.500 --> 00:51:20.640
+Nice. There you go.
+
+00:51:21.480 --> 00:51:23.820
+It's pg_stat_statements, I think, is what we're thinking of.
+
+00:51:24.320 --> 00:51:24.740
+Right, okay.
+
+00:51:25.200 --> 00:51:25.520
+Awesome.
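The "let the database tell you what's slow" idea Michael describes (Mongo's profiler with a slow-ms threshold) can also be mimicked in application code. A rough sketch with an arbitrary 20 ms threshold, using sqlite3 as the stand-in database; the helper name is mine:

```python
import sqlite3
import time

db = sqlite3.connect(":memory:")
slow_log = []
THRESHOLD_MS = 20  # "slow for me means 20 milliseconds"

def timed_query(sql, params=()):
    start = time.perf_counter()
    result = db.execute(sql, params).fetchall()
    elapsed_ms = (time.perf_counter() - start) * 1000
    if elapsed_ms >= THRESHOLD_MS:
        # Record anything over the threshold; review this sorted slowest-first.
        slow_log.append((elapsed_ms, sql))
    return result

rows = timed_query("SELECT 1")
# After running the app for a while, sorted(slow_log, reverse=True) points
# straight at the queries that are candidates for an index.
```

This is the same workflow Michael outlines: pick a number, run the app, sort by slowest, then decide which queries deserve an index.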
+
+00:51:26.759 --> 00:51:30.100
+If you're unsure about your database indexes, do this,
+
+00:51:30.560 --> 00:51:32.500
+or at least go back and review your queries.
+
+00:51:32.820 --> 00:51:33.400
+Yeah, I agree.
+
+00:51:34.200 --> 00:51:34.700
+Very good.
+
+00:51:35.300 --> 00:51:37.460
+All right, I can see we're blazing through these questions.
+
+00:51:38.200 --> 00:51:39.280
+I have one other.
+
+00:51:40.340 --> 00:51:41.100
+Yeah, please go ahead.
+
+00:51:41.240 --> 00:51:41.820
+Yeah, go ahead, David.
+
+00:51:42.720 --> 00:51:48.080
+If you want to get some more responsive parts of your website,
+
+00:51:48.300 --> 00:51:51.140
+like make your website a little more responsive or interactive
+
+00:51:51.320 --> 00:51:55.280
+with the user, htmx or Datastar, especially
+
+00:51:55.620 --> 00:51:57.800
+if you're using Quart or another ASGI framework,
+
+00:51:58.060 --> 00:52:01.240
+where you can do SSE, server-sent events, or WebSockets.
+
+00:52:03.480 --> 00:52:06.340
+Streaming little bits of changes to the web front end
+
+00:52:06.340 --> 00:52:09.920
+and then rendering them with the same HTML you're already
+
+00:52:10.140 --> 00:52:13.680
+writing can make things a lot more responsive.
+
+00:52:14.050 --> 00:52:18.340
+We had a talk about that from Chris May at FlaskCon last year,
+
+00:52:18.720 --> 00:52:19.600
+which you can find on YouTube.
+
+00:52:20.280 --> 00:52:20.800
+Awesome.
+
+00:52:22.080 --> 00:52:25.580
+You know, this is not one of the questions,
+
+00:52:25.840 --> 00:52:29.240
+but let me just start off with a quick riff on this, folks.
+
+00:52:29.940 --> 00:52:32.320
+Out in the audience, someone was asking, what about HTMX?
+
+00:52:32.840 --> 00:52:35.840
+And I think more broadly, I am actually
+
+00:52:35.920 --> 00:52:40.100
+a huge fan of server-side, template-based apps.
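The server-sent events David mentions are just framed text on a long-lived HTTP response: each event is one or more `data:` lines ended by a blank line. A minimal, framework-agnostic framing helper (the function name is mine; a Quart or Flask view would stream these strings from a generator):

```python
def sse_event(data, event=None):
    """Format one server-sent event frame per the SSE wire format."""
    lines = []
    if event:
        lines.append(f"event: {event}")
    # Multi-line payloads become multiple data: lines in the same frame.
    for chunk in str(data).splitlines() or [""]:
        lines.append(f"data: {chunk}")
    return "\n".join(lines) + "\n\n"

# An htmx/Datastar client listening on this stream can swap the fragment
# straight into the page -- the same HTML you're already writing.
frame = sse_event("<li>new item</li>", event="append")
```

The blank line at the end of each frame is what tells the browser's EventSource that the event is complete.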
+
+00:52:40.300 --> 00:52:42.400
+I think it just keeps things simpler in a lot of ways,
+
+00:52:42.500 --> 00:52:43.640
+unless you need a lot of interactivity.
+
+00:52:44.180 --> 00:52:47.620
+But things like HTMX or a little bit of JavaScript
+
+00:52:47.920 --> 00:52:50.740
+can reduce a lot of the traffic and stuff.
+
+00:52:50.860 --> 00:52:52.440
+Where do people land on those kinds of things?
+
+00:52:53.860 --> 00:52:55.480
+I absolutely love, love HTMX.
+
+00:52:57.050 --> 00:53:02.700
+Not just because you don't have to write a lot of JavaScript or whatever, but mostly
+
+00:53:02.820 --> 00:53:09.660
+because if I'm just building a simple app that needs a bit more than just be a static
+
+00:53:10.080 --> 00:53:14.580
+HTML page, it needs some interactivity, a little bit of reactivity.
+
+00:53:15.500 --> 00:53:22.800
+I feel like having the whole overhead of building an SPA, or whatever tools you need for the
+
+00:53:22.720 --> 00:53:24.840
+whole JavaScript, TypeScript, whatever stack,
+
+00:53:25.380 --> 00:53:28.460
+it's just so much work
+
+00:53:28.680 --> 00:53:31.040
+to make a simple thing a little bit nicer,
+
+00:53:31.220 --> 00:53:32.100
+a little bit more reactive.
+
+00:53:32.670 --> 00:53:35.520
+And I feel like HTMX just fits right in there.
+
+00:53:35.660 --> 00:53:37.320
+It's super great.
+
+00:53:37.480 --> 00:53:39.300
+I've got a couple of things with it now,
+
+00:53:39.870 --> 00:53:42.940
+a few of my own projects, a few things at work.
+
+00:53:43.440 --> 00:53:46.060
+And it makes things so much easier where,
+
+00:53:47.160 --> 00:53:49.860
+I don't know, the work probably wouldn't have been done
+
+00:53:49.930 --> 00:53:51.700
+if it was just, you know, because it's too much
+
+00:53:51.900 --> 00:53:56.840
+if you're doing a whole front end thing that you have then to deploy and build and whatever,
+
+00:53:57.360 --> 00:54:03.420
+or it would have been less nice.
So it's an amazing, really amazing thing.
+
+00:54:04.820 --> 00:54:12.560
+As the maintainer and author, though, one of the things that is, well, it's not frustrating,
+
+00:54:12.840 --> 00:54:17.280
+but it's understandable, is that HTMX is not for everybody, right?
+
+00:54:17.360 --> 00:54:20.520
+It's not like you can use HTMX, or Datastar, on all occasions, right?
+
+00:54:20.590 --> 00:54:23.160
+And so there are people that are always going to want to use React
+
+00:54:23.370 --> 00:54:25.680
+and there's going to be people that want to use all these other frameworks.
+
+00:54:25.790 --> 00:54:28.980
+And so having some cohesive way to make them all talk together,
+
+00:54:29.110 --> 00:54:29.860
+I think is important.
+
+00:54:30.620 --> 00:54:34.040
+I don't have that answer yet, but I just know that, like,
+
+00:54:34.440 --> 00:54:36.840
+I can't always say HTMX is it, right?
+
+00:54:37.320 --> 00:54:39.860
+And then you'll have a great time, because I'll inevitably meet somebody
+
+00:54:39.970 --> 00:54:41.700
+that says, I need to do this, and they're right,
+
+00:54:41.700 --> 00:54:45.500
+and a single page application or something is more appropriate for that.
+
+00:54:45.560 --> 00:54:50.100
+And so, you know, it's obviously the right tool for the right job when you need it.
+
+00:54:50.240 --> 00:54:55.180
+But, you know, I want to make something that is cohesive depending on whatever library you want to use.
+
+00:54:56.620 --> 00:54:57.820
+I would throw one thing in there, though.
+
+00:54:57.910 --> 00:55:02.040
+I would rather somebody start with HTMX than start with React if you don't need it.
+
+00:55:02.190 --> 00:55:03.740
+Because React can be total overkill.
+
+00:55:03.770 --> 00:55:05.080
+It can be great for some applications.
+
+00:55:05.720 --> 00:55:10.040
+But oftentimes, as a consultant, I'll see people having an About page and they throw React at it.
+
+00:55:10.160 --> 00:55:10.940
+It's like, why do you need that?
+
+00:55:11.420 --> 00:55:13.220
+Like, especially for small things with partials.
+
+00:55:13.900 --> 00:55:15.320
+You mean you don't want to start with Angular?
+
+00:55:16.980 --> 00:55:20.920
+You know, it's fine if you need it, but I don't think you really need it.
+
+00:55:21.110 --> 00:55:22.840
+Like, introduce tools as you need them.
+
+00:55:23.680 --> 00:55:28.720
+Django 6.0 just added template partials, and I guess my job here is to hand off to Carlton, because this is his feature.
+
+00:55:29.580 --> 00:55:31.820
+Yeah, I was happy to see that come in there, Carlton. Nice job.
+
+00:55:31.890 --> 00:55:33.800
+No, it's okay. Plug the new feature.
+
+00:55:33.890 --> 00:55:39.060
+So, I mean, I stepped down as a fellow in 2023 into a new business,
+
+00:55:39.440 --> 00:55:43.420
+and I read the essay about template fragments on the HTMX website,
+
+00:55:43.720 --> 00:55:47.380
+where it's about named reusable bits in the templates.
+
+00:55:48.000 --> 00:55:48.900
+And I was like, I need that.
+
+00:55:48.940 --> 00:55:50.520
+So I built django-template-partials,
+
+00:55:50.900 --> 00:55:52.000
+released it as a third-party package,
+
+00:55:52.240 --> 00:55:54.820
+and it's now just been merged into core for Django 6.0.
+
+00:55:55.460 --> 00:55:57.220
+And I have to say about HTMX,
+
+00:55:57.460 --> 00:55:59.180
+it's really changed the way I write websites.
+
+00:55:59.480 --> 00:56:00.480
+Before I was the fellow,
+
+00:56:00.480 --> 00:56:01.880
+I used to write mobile applications
+
+00:56:02.300 --> 00:56:04.460
+and do the front end of the mobile application,
+
+00:56:04.560 --> 00:56:06.580
+then the back end in Django using Django REST Framework.
+
+00:56:06.780 --> 00:56:09.520
+And that's how I got into open source,
+
+00:56:09.620 --> 00:56:10.700
+was via Django REST Framework.
+
+00:56:11.520 --> 00:56:15.500
+And since starting the new application, we're three years in.
+
+00:56:15.620 --> 00:56:17.500
+We've hardly got a JSON endpoint in sight.
+
+00:56:17.720 --> 00:56:21.140
+It's like two, three, four of them in the whole application.
+
+00:56:21.560 --> 00:56:24.140
+And it's just a delight.
+
+00:56:24.320 --> 00:56:26.700
+Again, you asked me at the beginning, Michael, am I having fun?
+
+00:56:26.760 --> 00:56:28.040
+Yeah, I really am having fun.
+
+00:56:28.140 --> 00:56:29.160
+And HTMX is the reason.
+
+00:56:29.480 --> 00:56:33.460
+I do grant there are these use cases where it's not right, but go for it.
+
+00:56:36.960 --> 00:56:37.280
+Awesome.
+
+00:56:37.460 --> 00:56:40.080
+All right, let's talk about our last topic.
+
+00:56:40.400 --> 00:56:44.700
+And we have five minutes-ish to do that.
+
+00:56:44.730 --> 00:56:46.560
+So we've got to stay on target, quick.
+
+00:56:46.590 --> 00:56:48.780
+But let's just go around real quick here.
+
+00:56:50.240 --> 00:56:52.480
+We talked about upgrading the Python version,
+
+00:56:53.460 --> 00:56:55.500
+getting better performance out of it.
+
+00:56:56.480 --> 00:56:58.060
+I mentioned the lower memory side.
+
+00:56:58.410 --> 00:57:02.840
+But I think one of the under-appreciated aspects
+
+00:57:02.870 --> 00:57:03.420
+of this--
+
+00:57:04.210 --> 00:57:08.399
+the Instagram team did a huge keynote talk on it a while ago--
+
+00:57:08.420 --> 00:57:12.800
+is the memory that you run into when you start to scale out
+
+00:57:12.980 --> 00:57:13.860
+your stuff on the server.
+
+00:57:14.080 --> 00:57:15.920
+Because you're like, oh, I want to have four workers
+
+00:57:16.120 --> 00:57:18.000
+so I can have more concurrency because of the GIL.
+
+00:57:18.200 --> 00:57:20.020
+So now you've got four copies of everything
+
+00:57:20.140 --> 00:57:21.960
+that you cache in memory, and just like the runtime.
+
+00:57:22.180 --> 00:57:25.300
+And now you need eight gigs instead of what would have been
+
+00:57:25.560 --> 00:57:27.180
+one or who knows, right?
+
+00:57:28.640 --> 00:57:30.960
+But with free-threaded Python coming on,
+
+00:57:31.660 --> 00:57:34.320
+which I've seen a couple of comments in the chat like, hey,
+
+00:57:34.500 --> 00:57:35.080
+tell us about this,
+
+00:57:37.460 --> 00:57:43.420
+we could have true concurrency and we wouldn't need to scale as much on the process side, I think,
+
+00:57:43.580 --> 00:57:47.220
+giving us both better performance and the ability to say, well, you actually need four times less
+
+00:57:47.500 --> 00:57:52.960
+memory, so you could run smaller servers or whatever. What's the free-threaded story for
+
+00:57:53.010 --> 00:57:58.660
+all the frameworks? Carlton, let's go back to you, doing it in reverse. I'm really excited about it. I
+
+00:57:58.760 --> 00:58:01.799
+don't know how it's going to play out, but I'm really excited about it. All it can do is help
+
+00:58:01.860 --> 00:58:09.220
+Django. You know, the async story in Django is nice and mature now, but still, most of it's
+
+00:58:09.220 --> 00:58:12.220
+sync. Like, you know, you're still going to default to sync, you're still going to write your sync
+
+00:58:12.360 --> 00:58:16.520
+views, you've still got template rendering. You know, Django's a template-based kind of framework,
+
+00:58:16.680 --> 00:58:25.959
+really. You're still going to want to run things synchronously, concurrently, and
+
+00:58:26.860 --> 00:58:32.180
+proper threads are going to be, yeah, they can't but help. I don't know how it's going to roll
+
+00:58:32.240 --> 00:58:36.340
+out. I'll let someone else go because I'm getting... Well, yeah, let me just elaborate on that for people
+
+00:58:36.510 --> 00:58:43.100
+out there before we move on. You could set up your worker process to say, I want you to actually
+
+00:58:43.360 --> 00:58:50.160
+run eight threads in this one worker process, and when multiple requests come in, they could both be
+
+00:58:50.340 --> 00:58:54.519
+sent off to the same worker to be
processed, and that allows that worker to do more, unless the GIL
+
+00:58:54.540 --> 00:58:59.900
+comes along and says, stop, you only get to do one thing in threads in Python. And all of a sudden,
+
+00:58:59.910 --> 00:59:04.260
+a lot of that falls down. This basically uncorks that and makes that easy all of a sudden. Even if
+
+00:59:04.270 --> 00:59:09.200
+you yourself are not writing async, your server can be more async. Yeah. And this is the thing
+
+00:59:09.230 --> 00:59:15.340
+that we found with ASGI, is that you dispatch to a thread, you know, using sync_to_async, or you dispatch it to
+
+00:59:15.340 --> 00:59:20.920
+a thread pool executor, but Python doesn't actually run that concurrently,
+
+00:59:21.100 --> 00:59:26.460
+in parallel. So it's like, ah, it doesn't actually go as fast as you want it to. And so you end up
+
+00:59:26.650 --> 00:59:30.960
+wanting multiple processes still. Yeah. All right. Let's keep it with Django. Jeff, what do you think?
+
+00:59:33.000 --> 00:59:35.340
+I'm going to defer to the others on this. I have the least thoughts.
+
+00:59:37.100 --> 00:59:43.200
+All right. Right down the stack, Sebastian. Down to the web framework, not the website.
+
+00:59:45.040 --> 00:59:48.200
+I think it's going to be awesome. This is going to help so much, so many things.
+
+00:59:48.780 --> 00:59:56.020
+The challenge is going to be the third-party libraries used by each individual application and whether they are compatible or not.
+
+00:59:56.940 --> 00:59:58.340
+That's where the challenge is going to be.
+
+00:59:58.480 --> 01:00:03.120
+But other than that, it's just going to be free extra performance for everyone,
+
+01:00:03.520 --> 01:00:06.200
+just, you know, like just upgrading the version of Python.
+
+01:00:06.310 --> 01:00:07.100
+So that's going to be awesome.
+
+01:00:10.000 --> 01:00:10.180
+Cody.
+
+01:00:11.580 --> 01:00:13.160
+Yeah, I'm going to echo what Sebastian just said.
+
+01:00:13.220 --> 01:00:16.280
+The third-party libraries, I think, are going to be the big kind of sticky point here.
+
+01:00:16.840 --> 01:00:18.220
+I'm looking forward to seeing what we can do.
+
+01:00:18.440 --> 01:00:21.640
+I'm going to kind of hold my thoughts and let Janik kind of speak a little bit on it,
+
+01:00:21.720 --> 01:00:23.960
+because I know that he's looked at msgspec specifically
+
+01:00:24.240 --> 01:00:26.400
+and some of the other things that might, you know,
+
+01:00:26.470 --> 01:00:27.460
+give some better context here.
+
+01:00:27.620 --> 01:00:32.780
+But yes, the third-party libraries are going to be kind of the sticky issue,
+
+01:00:33.060 --> 01:00:35.640
+but I'm looking forward to seeing what we can, you know, make happen.
+
+01:00:39.239 --> 01:00:44.120
+Well, I'm super excited, actually, specifically about async stuff,
+
+01:00:44.310 --> 01:00:47.760
+because for most of the time it was like, you know,
+
+01:00:47.860 --> 01:00:53.580
+if you can already saturate your CPU, you know, async doesn't help you much.
+
+01:00:53.970 --> 01:01:01.900
+Well, now, if you have proper threads, you can actually do that in async as well.
+
+01:01:02.540 --> 01:01:08.480
+And I think it's going to speed up a lot of applications just by default,
+
+01:01:08.860 --> 01:01:16.320
+because almost all async applications out there use threads in some capacity,
+
+01:01:16.420 --> 01:01:20.320
+because, well, most of the things aren't async by nature.
+
+01:01:21.590 --> 01:01:22.940
+So they will use a thread pool
+
+01:01:23.240 --> 01:01:26.000
+and it will run more concurrently.
+
+01:01:26.030 --> 01:01:29.160
+And so that's going to be better.
+
+01:01:29.690 --> 01:01:33.880
+But I'm also a bit scared about a few things,
+
+01:01:34.100 --> 01:01:37.600
+mainly, as a few others have said now,
+
+01:01:38.260 --> 01:01:39.780
+third-party libraries and extensions,
+
+01:01:40.440 --> 01:01:44.420
+specifically those that are Python C extensions.
+
+01:01:45.320 --> 01:01:56.000
+We just recently, I think like three weeks ago, also got msgspec released for Python 3.14 with proper free threading support.
+
+01:01:57.440 --> 01:01:58.820
+And that took a lot of work.
+
+01:01:59.360 --> 01:02:06.740
+Fortunately, a few of the CPython core devs chimed in and contributed their hours and helped out with that.
+
+01:02:07.440 --> 01:02:11.680
+And all around the ecosystem, the last few years, there's been a lot of work going on.
+
+01:02:12.580 --> 01:02:18.320
+But especially for more niche libraries that are still here and there,
+
+01:02:19.360 --> 01:02:28.820
+I think there's still a lot to do, and possibly also quite a few bugs lurking here and there
+
+01:02:28.960 --> 01:02:32.420
+that haven't been found or are really hard to track down.
+
+01:02:33.240 --> 01:02:38.500
+And I'm curious and a bit, well, maybe scared is too hard of a word.
+
+01:02:39.720 --> 01:02:44.440
+It's going to be a little bit of a bumpy ride as people turn that on
+
+01:02:44.660 --> 01:02:46.420
+and meet the reality of what's happening.
+
+01:02:47.010 --> 01:02:50.500
+However, I want to take Cody's warning and turn it on its head
+
+01:02:50.860 --> 01:02:52.000
+about these third-party libraries,
+
+01:02:52.940 --> 01:02:57.400
+because I think it's also an opportunity for regular Python developers
+
+01:02:57.550 --> 01:03:02.480
+who are not async fanatics to actually capture some of that capability.
+
+01:03:03.680 --> 01:03:05.240
+Say some library says,
+
+01:03:05.420 --> 01:03:08.680
+hey, we realize that if we actually implement this lower-level thing,
+
+01:03:08.820 --> 01:03:11.580
+whose implementation you don't actually see, in true
+
+01:03:11.800 --> 01:03:14.220
+threading, then you just use it.
+
+01:03:14.220 --> 01:03:15.300
+But you don't actually do threading.
+
+01:03:15.300 --> 01:03:17.560
+You just call even a blocking function.
+
+01:03:18.080 --> 01:03:19.640
+You might get a huge performance boost,
+
+01:03:20.080 --> 01:03:23.400
+a little bit like David was talking about with MarkupSafe.
+
+01:03:23.960 --> 01:03:26.060
+And all of a sudden,
+
+01:03:26.260 --> 01:03:27.960
+with you doing nothing with your code,
+
+01:03:28.980 --> 01:03:31.180
+it goes five times faster on an eight-core machine
+
+01:03:31.520 --> 01:03:33.720
+or something, in little places where it used to matter.
+
+01:03:34.520 --> 01:03:37.999
+Yeah, I'm super excited for what--
+
+01:03:38.020 --> 01:03:41.780
+we're currently focused on the things that are out there right now
+
+01:03:41.980 --> 01:03:43.840
+and that might need to be updated,
+
+01:03:43.910 --> 01:03:48.340
+but I'm super excited for what else might come of this, right?
+
+01:03:49.750 --> 01:03:51.640
+New things that will be developed,
+
+01:03:51.890 --> 01:03:55.500
+or stuff that we are currently not thinking about,
+
+01:03:55.530 --> 01:04:00.140
+or that hadn't been considered for the past 30 years or so
+
+01:04:00.230 --> 01:04:02.540
+because it wasn't feasible or wasn't possible
+
+01:04:02.740 --> 01:04:03.820
+or didn't make sense at all.
+
+01:04:04.480 --> 01:04:08.560
+And now, yeah, I think it will pay off, definitely.
+
+01:04:10.660 --> 01:04:13.980
+All right, Team Flask, you guys get the final word.
+
+01:04:14.940 --> 01:04:16.300
+Okay, Phil, do you want to go first?
+
+01:04:17.340 --> 01:04:18.120
+Sure, I can, yeah.
+
+01:04:18.940 --> 01:04:21.900
+I think it will probably be more advantageous to WSGI apps
+
+01:04:22.040 --> 01:04:23.540
+than it will be for ASGI apps.
+
+01:04:24.060 --> 01:04:25.140
+And when I've been playing with it,
+
+01:04:25.200 --> 01:04:27.000
+it's mostly on the WSGI Flask side,
+
+01:04:27.100 --> 01:04:28.480
+where I'm quite excited about it.
+
+01:04:28.940 --> 01:04:31.260
+At the same time, like the others, I'm a bit worried,
+
+01:04:31.480 --> 01:04:33.320
+because it's not clear to me, for example,
+
+01:04:33.440 --> 01:04:36.740
+that green threading is going to work that well with free threading.
+
+01:04:38.040 --> 01:04:40.080
+And that may have been fixed, but I don't think it has been yet.
+
+01:04:40.300 --> 01:04:43.040
+And that might then break a lot of WSGI apps.
+
+01:04:43.980 --> 01:04:45.140
+So that's next, I think.
+
+01:04:45.190 --> 01:04:48.160
+But yeah, very excited for Flask in particular.
+
+01:04:51.280 --> 01:04:52.900
+Thanks for bringing up green threading.
+
+01:04:52.960 --> 01:04:56.600
+I added that to my notes right now.
+
+01:04:57.570 --> 01:05:03.400
+So Flask has already emphasized for years and years and years:
+
+01:05:03.800 --> 01:05:06.140
+don't store stuff globally, don't have global state,
+
+01:05:06.600 --> 01:05:10.180
+bind stuff to the request response cycle if you need to store stuff,
+
+01:05:10.440 --> 01:05:11.860
+look stuff up from a cache otherwise.
+
+01:05:13.640 --> 01:05:15.960
+And my impression is that that emphasis has been pretty successful.
+
+01:05:15.970 --> 01:05:19.860
+I don't think there are any well-known extensions using global state
+
+01:05:19.930 --> 01:05:20.840
+or anything like that.
+
+01:05:22.440 --> 01:05:26.600
+It's helped that the dev server that we have is threaded by default.
+
+01:05:27.600 --> 01:05:29.220
+It's not going for performance, obviously.
+
+01:05:29.360 --> 01:05:30.440
+It's just running on your local machine,
+
+01:05:30.700 --> 01:05:33.520
+but it's already running in a threaded environment,
+
+01:05:33.720 --> 01:05:35.260
+running your application in a threaded environment,
+
+01:05:35.440 --> 01:05:37.560
+not a process-based one, by default.
+
+01:05:38.940 --> 01:05:40.420
+I don't know if anybody even knows
+
+01:05:40.460 --> 01:05:42.460
+that you can run the dev server as process-based.
+
+01:05:43.460 --> 01:05:47.560
+And we also already had, for a decade or more than a decade,
+
+01:05:48.380 --> 01:05:50.940
+gevent to enable the exact same thing
+
+01:05:51.100 --> 01:05:54.200
+that free threading is enabling for Flask,
+
+01:05:54.300 --> 01:05:58.420
+which is concurrent work and connections.
+
+01:05:59.820 --> 01:06:03.100
+And so plenty of applications are already deployed that way,
+
+01:06:03.380 --> 01:06:07.140
+using gevent to do what ASGI is kind of enabling.
+
+01:06:09.360 --> 01:06:13.340
+I've run all the test suites with pytest-freethreaded,
+
+01:06:13.740 --> 01:06:17.740
+which checks that your tests can run concurrently
+
+01:06:17.740 --> 01:06:18.740
+on the free-threaded builds.
+
+01:06:20.280 --> 01:06:21.840
+So go check that out, it's by Anthony Shaw.
+
+01:06:22.620 --> 01:06:24.800
+And I'm pretty sure Granian already supports free threading.
+
+01:06:25.880 --> 01:06:26.500
+I'm not sure, though.
+
+01:06:26.500 --> 01:06:27.580
+I haven't looked into Granian enough.
+
+01:06:27.860 --> 01:06:29.120
+You know, I'm not sure either.
+
+01:06:29.300 --> 01:06:32.620
+It does have a runtime threaded mode,
+
+01:06:32.650 --> 01:06:34.640
+but I don't know if that's truly free-threaded or not.
+
+01:06:35.280 --> 01:06:38.040
+Yeah, but all of those things combined
+
+01:06:38.440 --> 01:06:40.860
+make me pretty optimistic that Flask
+
+01:06:40.870 --> 01:06:42.620
+will be able to take advantage of this
+
+01:06:44.100 --> 01:06:45.680
+without much work from us.
+
+01:06:45.800 --> 01:06:47.420
+I mean, I know that's a big statement right there.
+
+01:06:47.420 --> 01:06:48.480
+I haven't tested it,
+
+01:06:49.690 --> 01:06:51.260
+but the fact that we've emphasized
+
+01:06:51.510 --> 01:06:53.680
+all these different parts for so long already
+
+01:06:53.960 --> 01:06:54.960
+makes me confident about it.
+
+01:06:55.720 --> 01:06:56.560
+Yeah, awesome.
+
+01:06:57.440 --> 01:06:58.720
+I'm also super excited about it.
+
+01:06:58.780 --> 01:07:00.740
+And just one final thought I'll throw out there
+
+01:07:00.940 --> 01:07:03.680
+before we call it a show, because we could go on for much longer,
+
+01:07:03.750 --> 01:07:04.320
+but we're out of time.
+
+01:07:05.300 --> 01:07:09.360
+I think once this comes along, whichever framework
+
+01:07:09.360 --> 01:07:11.160
+of this group you're using out there,
+
+01:07:11.960 --> 01:07:14.280
+there are a bunch of inner working pieces,
+
+01:07:15.260 --> 01:07:16.840
+and one of them may have some kind of issue.
+
+01:07:17.010 --> 01:07:20.980
+And I think it's worth doing some proper load testing on your app.
+
+01:07:21.090 --> 01:07:24.300
+You know, point something like locust.io at it and just say,
+
+01:07:24.310 --> 01:07:27.240
+well, what if we gave it 10,000 concurrent users for an hour?
+
+01:07:27.290 --> 01:07:28.140
+Does it stop working?
+
+01:07:28.600 --> 01:07:30.060
+Does it crash or does it just keep going?
+
+01:07:30.240 --> 01:07:32.120
+I think it's a pretty good thing
+
+01:07:32.150 --> 01:07:34.700
+to do the first time before you deploy your first free
+
+01:07:34.780 --> 01:07:35.320
+threaded version.
+
+01:07:38.839 --> 01:07:39.200
+Yeah.
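Before that first free-threaded deploy, it's also easy to sanity-check which build you're on and whether CPU-bound thread work can even parallelize. A small sketch of the GIL discussion above (the helper name is mine; `sys._is_gil_enabled` exists on Python 3.13+, and older builds always have the GIL):

```python
import sys
import sysconfig
from concurrent.futures import ThreadPoolExecutor

def gil_enabled():
    # sys._is_gil_enabled() reports the runtime state on 3.13+;
    # free-threaded builds are compiled with Py_GIL_DISABLED.
    check = getattr(sys, "_is_gil_enabled", None)
    if check is not None:
        return bool(check())
    return sysconfig.get_config_var("Py_GIL_DISABLED") != 1

def cpu_work(n):
    # Pure CPU-bound work: threads only run this in parallel on a
    # free-threaded build; with the GIL they take turns.
    return sum(i * i for i in range(n))

# Eight threads in one worker process, as Michael describes; results are
# identical either way, only the wall-clock time differs.
with ThreadPoolExecutor(max_workers=4) as pool:
    results = list(pool.map(cpu_work, [100_000] * 4))
```

The results are correct on both builds; the difference free threading makes is that this loop actually uses multiple cores, which is exactly what a load test should surface.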
+ +01:07:40.539 --> 01:07:41.180 +All right, everyone. + +01:07:42.199 --> 01:07:43.800 +I would love to talk some more. + +01:07:43.800 --> 01:07:45.400 +This is such a good conversation, but I also + +01:07:45.640 --> 01:07:47.840 +want to respect your time and all that. + +01:07:48.260 --> 01:07:50.840 +So thank you for being here. + +01:07:51.400 --> 01:07:53.340 +It's been an honor to get you all together + +01:07:53.450 --> 01:07:54.240 +and have this conversation. + +01:07:56.240 --> 01:07:57.580 +Thank you very much for having us all. + +01:07:57.820 --> 01:07:58.120 +Thanks, everybody. + +01:07:58.520 --> 01:07:58.980 +Nice being here. + +01:07:59.200 --> 01:07:59.940 +Yeah, thanks for having us. + +01:08:00.360 --> 01:08:01.080 +Thanks for having us. + +01:08:01.700 --> 01:08:01.840 +Bye. +