I switched to Linux in the spring of 2025. Not as an experiment. Not because I wanted to learn something new. I just decided I was done with Windows and wanted to make a clean break. The kind of decision you make on a Tuesday evening that feels completely rational in the moment and absolutely terrifying by Wednesday morning.
The first week was a crash course in discovering what I'd been taking for granted. On Windows, there's an app for everything. You Google "best IPTV player for Windows" and you get a curated list from 2024 that's still mostly relevant. The ecosystem is enormous, the categories are well-established, and if something exists anywhere it exists there.
I run Arch, by the way. (Yes, I know what the meme is. Yes, I mention it anyway.) On Arch, the app landscape is different. Not worse - different. Some things are genuinely better. Some things don't exist at all, or the alternatives require you to care deeply about config files and have opinions about init systems. App discovery is less Googling and more archaeology.
And when it came to IPTV players, I found almost nothing.
The gap that nobody had filled
I have IPTV. A proper subscription with a large playlist. We're talking north of 200,000 channels across live TV, VOD, and series. On Windows I'd been using a dedicated player that handled the whole thing without drama. On Linux, I started searching for equivalents and kept finding the same things: VLC (which can technically play IPTV streams but isn't really an IPTV client), Kodi (which is technically everything but requires you to set up an entire media center ecosystem just to watch TV), and a handful of abandoned projects with last commits from 2019.
Then I found Open-TV.
Open-TV is a lightweight IPTV player built with Tauri and MPV. It does the core things well: import M3U playlists or Xtream Codes credentials, browse channels, click to watch. Clean UI. Fast. It's the kind of project that exists because someone had the same problem I did and actually solved it. I installed it, imported my playlist, and it worked. That alone put it ahead of most things I'd tried.
But I watch a lot of content in Swedish. My IPTV provider sends multiple audio tracks for a lot of channels - Swedish, English, original language. Open-TV had no way to set a preferred audio language. It also had no subtitle language preferences. Every time MPV opened, I was back to manually cycling through tracks.
Small problem. Annoying problem. The kind of problem that makes something go from "good enough" to "not quite good enough."
The pull request that went nowhere
My first instinct wasn't to build my own app. That would be excessive. I forked Open-TV, added language settings for both audio and subtitles, wired them into the MPV arguments that Open-TV passes when it opens a stream, and submitted a pull request.
It felt like a clean contribution. The feature was small, focused, and something the project was clearly missing. I wrote up the PR, explained the changes, and waited.
A week passed. Nothing.
Two weeks. Nothing.
A month. The PR sat there with zero comments, zero review, zero acknowledgment. Not rejected, not closed - just open, waiting, and increasingly feeling like shouting into a void.
I don't blame the maintainer. Open source projects are built and maintained by people who have jobs and lives and finite hours. A PR from a stranger sitting in the queue isn't a moral failure. It's just how open source works sometimes. But understanding that intellectually didn't make the waiting feel better.
After a month of my fork sitting on my machine while the upstream PR collected dust, I made a decision that felt impulsive at the time but probably wasn't.
I'll just build my own.
Frustration-driven development
This is, by the way, an absolutely terrible reason to start a project. Building your own version of something because a PR went unmerged is classic scope creep driven by ego. You end up rebuilding things that already work just because you're annoyed. Any sensible person would continue using the fork locally, keep the PR open on the off chance it gets picked up, and move on.
I am not always a sensible person. I started writing Better-IPTV in November 2025.
The first question was what stack to use. Open-TV had given me a decent introduction to Tauri, which I'd never used before. Tauri is a framework for building desktop apps with a Rust backend and a web UI frontend. The pitch that got my attention was binary size: a Tauri app compiles to a native binary with a small footprint, as opposed to Electron which bundles an entire Chromium instance and results in binaries that are genuinely enormous. I'd experimented with Electron for cross-platform apps before and I actively disliked how bloated the outputs felt.
Tauri meant Rust. I'd read about Rust more than I'd written it. The reputation is well-established: steep learning curve, notoriously strict compiler, but reliable and fast once it compiles. Would I have chosen Rust if I were optimizing for getting something working quickly? Probably not. But I'd already decided on Tauri and Tauri meant Rust, so here we are.
For video playback, the decision was easy. MPV is exceptional. It handles every codec you'll encounter in an IPTV context, has hardware acceleration, supports every subtitle and audio track scenario I could think of, and is maintained by people who clearly care about it deeply. Reinventing that would be insane. Better-IPTV launches MPV as an external player - the app manages your library and playlist, MPV handles the actual playback. Clean separation, no compromises on codec support.
I credit Open-TV for this architectural insight. Open-TV made the same choice, and it's the right one.
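The handoff itself is mostly argument construction. The real implementation lives in the Rust backend, but a TypeScript sketch shows the shape - and mpv's --alang and --slang flags, which take comma-separated language codes, are exactly what make the audio and subtitle preferences from earlier possible:

```typescript
// Sketch of the external-player handoff. The actual backend is Rust;
// this just illustrates the shape of the MPV invocation.
// --alang / --slang are real mpv options: comma-separated language
// codes in preference order.

function buildMpvArgs(
  streamUrl: string,
  audioLangs: string[] = [],
  subLangs: string[] = [],
): string[] {
  const args: string[] = [];
  if (audioLangs.length > 0) args.push(`--alang=${audioLangs.join(",")}`);
  if (subLangs.length > 0) args.push(`--slang=${subLangs.join(",")}`);
  args.push(streamUrl); // mpv treats the bare trailing argument as the thing to play
  return args;
}

// The app then hands this to the OS, e.g.
//   spawn("mpv", buildMpvArgs(url, ["swe", "eng"], ["swe"]));
// and MPV takes over playback entirely.
```

The app never touches the video pipeline; it only decides what MPV gets told before launch.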
The 200,000 channel problem
My personal IPTV playlist has somewhere around 200,000 entries. Live TV, movies, series - all in one M3U file. This is absurd, and I'm fully aware of that. Most people have a few thousand channels at most. But my provider packages everything together, and I'd never bothered to trim it.
This immediately became a technical constraint I had to solve before anything else. You cannot render 200,000 channel cards in a browser-based UI without making the whole thing unusable. The DOM would groan under the weight of it. Scroll performance would be catastrophic.
The solution is virtual scrolling. Instead of rendering every channel card in the DOM, you render only what's visible on screen plus a small buffer above and below the viewport. As you scroll, the DOM updates to reflect what should be visible at the new position. The total number of DOM nodes stays constant regardless of list size.
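The windowing arithmetic is simple once written down. A minimal sketch, assuming fixed-height rows (real libraries generalize this to measured, variable heights):

```typescript
// Core windowing math behind virtual scrolling, assuming fixed-height rows.

interface VisibleRange {
  start: number; // first row index to render
  end: number;   // one past the last row index to render
}

function visibleRange(
  scrollTop: number,      // current scroll offset in px
  viewportHeight: number, // height of the scroll container in px
  rowHeight: number,      // fixed height of one channel row in px
  totalRows: number,      // e.g. 200_000
  overscan: number = 5,   // extra rows above/below to avoid flicker while scrolling
): VisibleRange {
  const first = Math.floor(scrollTop / rowHeight);
  const last = Math.ceil((scrollTop + viewportHeight) / rowHeight);
  return {
    start: Math.max(0, first - overscan),
    end: Math.min(totalRows, last + overscan),
  };
}

// Rendered DOM nodes = end - start: constant regardless of totalRows.
```

Everything outside that range is represented only by an empty spacer of the right height, so the scrollbar still behaves as if all 200,000 rows existed.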
I used @tanstack/react-virtual for this. All channels are loaded from the local SQLite database into memory - the database handles the heavy lifting of storing and querying them efficiently - but only the visible rows get rendered. Whether you have 1,000 channels or 200,000, the UI behaves the same way. I developed the app against my own playlist the entire time, which meant any performance regression showed up immediately in my daily use.
This is one of the things I think I got right early: building something for the most extreme use case I had access to. If it handles 200,000 channels without complaining, it's going to be fine for everyone else.
Shipping v2.0.0
The first stable release came out in November 2025. Live TV, movies, series, EPG support, dark and light themes, Linux and Windows builds. The macOS build was there too, though I don't have a Mac to test on, which I disclosed openly in the README.
I called it v2.0.0, which is an odd version number for something that had never been public before. The reason is that there was a v1.
After I decided to build my own player, I did exactly that - built it, ran it on my own machine, iterated on it, and lived with the choices I made early on. Some of those choices were fine. Some of them were the kind of thing you only recognize as a mistake once you've been looking at them long enough. v1.x was a private project, built for me, and it taught me what the app actually needed to be.
Before I considered making it public, I rewrote it. Not a refactor - a proper rewrite, starting from what I'd learned. The architecture was cleaner. The code made more sense. I had a clearer idea of what features mattered and what I'd added on impulse. v2.0.0 was the version I was willing to put my name on.
This is why the GitHub history starts at v2.0.0. There is no public v1. It exists only on my machine, as a record of the first pass, which is probably where it belongs.
It shipped with the subtitle and audio language preferences that started the whole thing. That felt appropriate.
Giving it a home
A desktop app without a website feels incomplete in a way that didn't bother me until it did. The GitHub README is fine for developers who find the project through search or links, but it isn't a landing page. It doesn't tell a stranger in three seconds what the app does or whether they should bother downloading it.
I wanted something static - no server, no database, just files that Vercel could serve. I'd been hearing good things about Astro for a while. The pitch is essentially: build with components, ship zero JavaScript by default, get a fast static site without having to think too hard about it. For a project page that's mostly content with a download button, that sounded exactly right.
I built it in TypeScript + Astro with Tailwind, with some guidance from Opus 4.6 to fill in the gaps where I wasn't sure how Astro expected things to work. I'm not going to pretend I didn't use AI here - I mention it in the spirit of the same honesty I try to apply everywhere else. Astro was new to me. Having something to ask questions to while I learned the file structure and component model made the process faster.
The Astro feature that impressed me most is genuinely simple to implement but solves a real problem: the page fetches the latest release version from the GitHub API at build time. This means the download section always shows the current version number and links to the correct release assets without me having to update anything manually. Every time a new release goes out, a CI trigger fires a Vercel rebuild, the page fetches the new release data, and the site updates automatically. Static site with build-time data fetching. It sounds obvious once you see it, but it's the kind of pattern that makes you appreciate what Astro was designed for.
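The data side of that pattern is a fetch plus some JSON picking, run in a component's frontmatter during the build so none of it ships to the browser. A simplified sketch - the GitHub fields used (tag_name, assets[].name, assets[].browser_download_url) are real API fields, but the helper is illustrative rather than the site's exact code:

```typescript
// Shape of the GitHub "latest release" response, reduced to what the page needs.
interface ReleaseAsset {
  name: string;
  browser_download_url: string;
}

interface Release {
  tag_name: string; // e.g. "v2.5.0"
  assets: ReleaseAsset[];
}

// Turn a release payload into a version string and a name -> URL map
// that the download section can render from.
function downloadLinks(release: Release): {
  version: string;
  assets: Map<string, string>;
} {
  const assets = new Map<string, string>();
  for (const a of release.assets) {
    assets.set(a.name, a.browser_download_url);
  }
  return { version: release.tag_name.replace(/^v/, ""), assets };
}

// At build time, roughly:
//   const res = await fetch(
//     "https://api.github.com/repos/mewset/better-iptv/releases/latest",
//   );
//   const { version, assets } = downloadLinks(await res.json());
```

Because this runs during `astro build`, a stale API response can only ever produce a stale page, never a broken one at runtime.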
The build output is about 330KB. Around 260KB of that is images - screenshots, the OG image, logos. The CSS comes in at 24KB. The HTML is 40KB. JavaScript shipped to the browser: zero bytes. That last number is the one Astro is actually proud of, and rightfully so. A project page does not need JavaScript. It needs text, images, and links, and Astro lets you build that without accidentally shipping a framework.
The site is at better-iptv.vercel.app. It does what a project page should do: explain what the thing is, show some screenshots, answer the most common questions, and get out of the way.
Opening the code
Releasing it as open source was not an automatic decision.
I have imposter syndrome. The specific flavor of it where you build things constantly while being convinced that you're not a real developer and definitely shouldn't be showing your code to actual developers. You ship things for personal use because the use is real, but the idea of someone else reading the code feels exposing in a way that's difficult to fully rationalize.
There's a particular strain of it that affects projects like this. Better-IPTV was entirely self-taught, built in a language I was learning as I went, using a framework I'd never touched before. The professional software developer in my head - the one who doesn't actually exist but has strong opinions about everything I do - had a lot to say about the commit history, the architecture decisions made under uncertainty, the places where I clearly had no idea what I was doing and wrote something that worked anyway. Releasing the code meant inviting actual software developers to look at all of that.
There's also a more practical reason this project exists publicly at all. In November 2025 I quit my job after ten years as a lead technician. Six months of paid leave, no job lined up, and suddenly the kind of time and mental space I hadn't had in years. Without that, Better-IPTV would almost certainly still be sitting on my machine as a private tool I built for myself and never got around to cleaning up. The imposter syndrome doesn't go away when you have more time, but the excuses run out faster. I couldn't tell myself I was too busy.
I released it anyway. Partly because GPL v2 left me little choice if I wanted to distribute binaries. Partly because keeping it private felt cowardly in a way I didn't want to be. And partly because the project was genuinely useful and I figured other Linux users with large IPTV playlists probably existed and might want it.
The code wasn't embarrassing, exactly. It worked. It handled the use cases I cared about. But it was also a project built over a few weeks by someone learning Rust and Tauri at the same time as shipping something. There are rough edges in there. Commit messages that say "Stuff" and "fixed stuff." Approaches I would choose differently today. The honest archaeology of someone learning by doing.
I released it as GPL v2.0 because MPV is GPL v2, which means anything distributed with MPV needs to be GPL-compatible. It's not a constraint I resent - I think open source is the right home for a tool like this. But it pushed me toward making it properly public rather than just dropping binaries on GitHub and hoping no one looked at the source.
I pushed the code, published the release, and waited to see if anything happened.
What happened after release
Things got better fast, mostly because real-world feedback is more useful than imagining what users might want.
Multi-profile support came in v2.1.0 a couple of weeks after launch. I run multiple IPTV subscriptions - my main one and a backup from a different provider. Switching between them in the original version required re-importing everything, which was tedious. Multi-profile let you manage separate playlists with independent channel lists, favorites, and settings. Once I built it for myself, it was clearly something anyone with more than one provider would want.
Parental controls came in v2.3.0 in December. This one came from thinking about who else might use this. I built the app for myself, but "myself" includes the fact that I have children and a large playlist that contains content absolutely not suitable for them. The parental controls ended up being more comprehensive than I originally planned: PIN protection with Argon2 hashing, manual channel blocking, automatic detection of channels with adult markers in their names, category-level blocking, and three display modes (hide completely, show with a lock icon, show blurred). Scope creep in the best possible way.
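The auto-detection part boils down to token matching against channel names. A simplified sketch of the idea - not the app's exact rules or its full marker list:

```typescript
// Simplified marker-based detection - illustrative, not the shipped rule set.
const ADULT_MARKERS = ["xxx", "adult", "18+"];

function isAdultChannel(name: string): boolean {
  const lower = name.toLowerCase();
  return ADULT_MARKERS.some((marker) => lower.includes(marker));
}

// Naive substring matching will produce both misses and false positives
// (think "Adult Swim"), which is part of why manual channel blocking and
// category-level blocking exist alongside the automatic detection.
```

Detection only flags channels; what happens to a flagged channel (hidden, locked, or blurred) is the separate display-mode setting.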
Then came the Arch Linux AppImage situation.
AppImages are supposed to be self-contained. You bundle everything the app needs, and it runs on any Linux distribution without requiring installation. The catch is that "everything" includes WebKit, which is what Tauri uses for its UI layer. The version of WebKit bundled in a typical AppImage build is compiled against Ubuntu's system libraries. On Ubuntu, this is fine. On Arch, which is a rolling-release distro with newer system libraries, the bundled WebKit from the Ubuntu-built AppImage conflicts with what's on the system. The app either crashes immediately with a cryptic EGL error or refuses to start entirely.
The fix was to provide two AppImage variants: one that bundles WebKit for Ubuntu-based systems, and one that uses the system WebKit for Arch-based systems. It means maintaining two build targets, but it means the app actually works on both. The AUR package automatically uses the correct variant, so Arch users who install through the AUR get the right build without having to think about it.
This is the kind of Linux-specific quirk that you only discover by actually running Arch as your daily driver. If I'd been developing on Ubuntu, I probably wouldn't have hit it until someone filed a bug report. Running the thing I'm building, on the system I use, means I find these problems before they become other people's problems.
In December, alongside v2.2.0, I put the project on the AUR. I knew what the AUR was - I use Arch, I've installed things from it hundreds of times - but I had no idea what it actually meant to get something listed there. The mechanics of it are not complicated once you understand them, but they're also not obvious. PKGBUILD files, .SRCINFO, the AUR SSH submission process, keeping the package in sync with releases. I went in assuming it would be straightforward and came out having spent a few hours reading documentation I hadn't anticipated needing to read. The Arch wiki is thorough, at least.
It's available now as better-iptv on the AUR.
What I'd do differently
CI/CD from day one. I set up proper pipelines partway through development, and retrofitting them is always more painful than building them in from the start. The release workflow for a Tauri app is not trivial - compiling for multiple targets, signing binaries, generating the AppImage variants, keeping the AUR package in sync. I assembled it incrementally in ways that left rough edges I'm still cleaning up. Starting with a proper automated release pipeline would have saved real time.
I would have polished the project more before making it public. Not polished in the sense of adding features - polished in the sense of cleaning up the early commit history, removing the "Stuff" commits, writing proper initial documentation, and generally making it look like something I was confident in before I put my name on it. I released it when it worked rather than when it was presentable, which was probably the right call for my own motivation but made it harder to take seriously as something other people should use.
The Rust question is one I still think about. Rust was the right choice in the sense that Tauri requires it and the backend is fast and reliable. It's the wrong choice in the sense that I was learning it during development, which slowed me down significantly and meant I was making architectural decisions without fully understanding their implications. I've gotten better with it over the course of the project, but if I were starting fresh today I might consider whether there was a way to use a backend language I already knew well. The answer is probably still Rust because Tauri is genuinely the right framework for this, but I would at least think harder about it.
I also underestimated how much work cross-platform actually is. The pitch of "write once, run anywhere" is always more complicated in practice. Windows needs bundled MPV (since v2.3.0, that's handled automatically). Linux needs two AppImage variants. macOS needs different signing and distribution handling. None of these are blockers, but each one took time I hadn't budgeted.
What it is now
Better-IPTV is at v2.5.0 as of late February 2026. 143 commits. Linux, Windows, and macOS. Virtual scrolling that handles 200,000 channels without drama. Multi-profile support. Parental controls with PIN and auto-detection. EPG. Favorites. Keyboard shortcuts. Playlist auto-refresh. Custom user-agent settings for providers that require it. Available on the AUR for Arch users.
It handles everything I personally need from an IPTV client on Linux, which was always the actual goal.
The thing about building something because a PR sat ignored is that the frustration runs out pretty quickly once you're actually building. The annoyance that started it became irrelevant after the first week. I wasn't thinking about the unmerged PR anymore - I was thinking about virtual scrolling performance and Wayland compatibility and how to handle channels whose names contain forward slashes. The origin story becomes trivia. The software becomes the thing.
I don't know if Open-TV's maintainer ever saw the original PR. Maybe they did, looked at it, had their reasons for not merging it, and moved on. That's fine. The project is theirs and what they include in it is entirely their call. Better-IPTV is a separate project, built from scratch - not a fork. But it wouldn't exist without Open-TV giving me a reference point for what a good Linux IPTV player looked like, and without the frustration of a PR that went nowhere pushing me to build something of my own.
The subtitle language preference that started all of this still works exactly the way I wanted it to.
Better-IPTV is on GitHub at github.com/mewset/better-iptv. Arch users can install via yay -S better-iptv. Linux, Windows, and macOS binaries on the releases page.