PyCon 2025 Saturday - Talks
Saturday was another day of talks. At this point in a conference it's not unreasonable to start feeling a bit burnt out. When I visited EMNLP last year, my coworker gave me some wonderful advice:
"pace yourself. It's tempting to go to everything but it's not sustainable. Inevitably every breakfast, lunch, and dinner will be with colleagues (or folks you know from past schools and companies), which is draining"
I really appreciated that lesson and made sure to share it with coworkers this year. Especially when you are sent by your employer, you can self-impose a pressure to always be "on" at a conference, but that's not possible. Verbally let your coworkers know to be kind to themselves so you can all collectively create that culture. You might not attend as many sessions, but you will find that you are more present during the ones you do attend!
Writing Extension Modules To Be Interruptible:
I knew this would be an interesting talk when, early on, it landed on the phrase "There is no magic fix, this is inherent to how it works". The talk covers that common scenario where you try to kill a Python program and it... just hangs.
One part I really appreciated was that they showed how it can be avoided, but also pointed out how even well-intentioned people trying to do their best can miss the relevant part of the docs or misunderstand it because it's confusing.
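To jog my memory later, here is a rough illustration of the failure mode, my own stand-in example rather than code from the talk: pure-Python code notices Ctrl+C because the interpreter checks for pending signals between bytecodes, but a single long call into C only notices the signal promptly if the C code periodically calls PyErr_CheckSignals().

```python
# My own hedged illustration (not from the talk). Pure-Python loops are easy
# to interrupt: CPython checks for pending signals between bytecodes, so
# Ctrl+C raises KeyboardInterrupt almost immediately. A single long-running
# call into C is only interrupted promptly if the C code periodically calls
# PyErr_CheckSignals(); otherwise the signal sits pending until the call
# returns, which is the "I hit Ctrl+C and it just hangs" experience.
import re
import time


def easy_to_interrupt():
    while True:          # Ctrl+C lands between bytecodes and raises right away
        time.sleep(0.1)


def hard_to_interrupt():
    # One long call into the C-level regex engine: catastrophic backtracking,
    # a commonly cited case. If the engine doesn't check for pending signals,
    # Ctrl+C appears to do nothing until the match finally gives up.
    re.match(r"(a+)+$", "a" * 30 + "b")  # deliberately takes a very long time


if __name__ == "__main__":
    hard_to_interrupt()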
Scaling the Mountain: A Framework for Tackling Large-Scale Tech Debt:
This was interesting because it covered both the technical and the corpo-politics aspects that can make tackling tech debt harder than it needs to be: that potent combination of a lack of visibility meeting no clear accountability.
They have a repo for the project available.
GPU Programming in Pure Python:
The speaker talked about wanting to change the perspective that parallelism is something non-default to be used cautiously, because GPUs are not exotic anymore. I liked how that sentiment was tempered with the idea that you need a workload that suits parallelism and overcomes the fixed cost:
"If you're doing linear amount of compute like summing you need gbs of data for it to payoff... If you're sorting or doing matrix multiplications then you need less data than that"
The Zen of Polymorphism: Choosing between isinstance(), methods, and @singledispatch:
I appreciated this talk and will avoid the temptation to just dump a bunch of my notes here. The recording will do better justice to the idea.
The example use case was an AST for a simple calculator. We walked through four techniques for evaluating the nodes:
- one big function w/ isinstance
- object-oriented programming
- dynamic dispatch using @singledispatch
- catamorphism with @traverse.register
Brett was very fair comparing the trade-offs of the different approaches and when to pick one over the other, especially if you are more worried about jumping around when writing the implementation vs. jumping around when debugging. Plus I love getting to hear fun language-theory phrases like catamorphism.
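For future-me, here is a minimal sketch of two of the four approaches on a toy calculator AST. This is my own reconstruction, not Brett's actual example:

```python
# Two of the four approaches: one big isinstance() function vs. @singledispatch.
from dataclasses import dataclass
from functools import singledispatch


@dataclass
class Num:
    value: float


@dataclass
class Add:
    left: object
    right: object


@dataclass
class Mul:
    left: object
    right: object


# Approach: one big function full of isinstance() checks.
def evaluate(node):
    if isinstance(node, Num):
        return node.value
    if isinstance(node, Add):
        return evaluate(node.left) + evaluate(node.right)
    if isinstance(node, Mul):
        return evaluate(node.left) * evaluate(node.right)
    raise TypeError(f"unknown node: {node!r}")


# Approach: dynamic dispatch, one small handler registered per node type.
@singledispatch
def ev(node):
    raise TypeError(f"unknown node: {node!r}")


@ev.register
def _(node: Num):
    return node.value


@ev.register
def _(node: Add):
    return ev(node.left) + ev(node.right)


@ev.register
def _(node: Mul):
    return ev(node.left) * ev(node.right)


tree = Add(Num(1), Mul(Num(2), Num(3)))
assert evaluate(tree) == ev(tree) == 7
```

The jumping-around trade-off shows up even in this toy: the isinstance version keeps the whole evaluation in one place when you are reading or debugging it, while the singledispatch version gives each node type its own small handler, which is nicer when you are adding node types.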
Elastic Generics: Flexible Static Typing with TypeVarTuple and Unpack:
The actual use case in this talk felt like a stretch, but I enjoyed learning about TypeVarTuple and unpacking. I know I just said that I love language theory, but I feel like Python is not really the language to go super deep into it. A lot of the typing feels bolted on.
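For reference, a minimal sketch of what TypeVarTuple and Unpack buy you. This is my own toy example (needs Python 3.11+), not the use case from the talk:

```python
# The variadic part of a tuple's type is captured in Ts and carried through
# to the return type, instead of collapsing to tuple[Any, ...].
from typing import TypeVar, TypeVarTuple, Unpack

T = TypeVar("T")
Ts = TypeVarTuple("Ts")


def move_first_to_last(tup: tuple[T, Unpack[Ts]]) -> tuple[Unpack[Ts], T]:
    first, *rest = tup
    return (*rest, first)


# A type checker infers tuple[str, float, int] here, element types preserved.
shifted = move_first_to_last((1, "a", 2.0))
print(shifted)  # ('a', 2.0, 1)
```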
Notes, Groups and Stars: exceptional new features of Python 3.11!:
My brain was starting to turn into goo at this talk, so I want to revisit it, but the presenter was very fun: "I'm sad when I look at this code, and I wrote it."
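Going by the title, the features are exception notes, exception groups, and except*. A minimal sketch so I remember what they look like when I revisit the recording (my own reconstruction, not the speaker's code):

```python
# Notes, Groups, and Stars in Python 3.11: add_note(), ExceptionGroup, except*.
def work(i: int) -> None:
    err = ValueError(f"task {i} failed") if i % 2 else OSError(f"task {i} io error")
    err.add_note(f"while processing item {i}")  # "Notes": extra context on any exception
    raise err


errors = []
for i in range(4):
    try:
        work(i)
    except Exception as exc:
        errors.append(exc)

try:
    raise ExceptionGroup("batch failed", errors)  # "Groups": raise several at once
except* ValueError as eg:   # "Stars": handle just the ValueErrors in the group
    print(f"{len(eg.exceptions)} value errors")
except* OSError as eg:      # a different handler can still see the OSErrors
    print(f"{len(eg.exceptions)} os errors")
```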
What they don't tell you about building a JIT compiler for CPython:
The speaker commented that they gave a talk last year with the same title, so this one should have a red asterisk next to its title.
I had heard about the Faster CPython initiative but hadn't really dug into it. This talk was a nice overview of what has been done so far and where the JIT pieces are going. Also the fun line: "If you're a Rust programmer this is what we call unsafe code"