Connecting to the Network of the Future – Macs don’t like Realtek – Network Upgrade Frustration

2024/05/10

the network is the new bottleneck — when does slow get too slow? — push it under the rug with a good connection!

Note: I’m putting this post out just in case someone ends up in the same situation as I did. If you somehow reached this page because you are looking for a USB-C/Thunderbolt dongle or adapter for speeds above 1Gbps on your Mac, you can save yourself some time and directly buy the right thing.

Frustration About my Network Upgrade

Last year, when moving into my current flat, I got the option to get 10Gbps internet at no extra monthly cost; I only needed another router. So I figured, hey, why not? I don’t need that much bandwidth, but Docker containers and programming language dependencies aren’t exactly light, so a higher bandwidth certainly can’t hurt.

My main development machine is a MacBook Pro, which I tend to plug into an Ethernet cable via a USB-C dongle when working from my desk (more on why I prefer that to Wi-Fi below).

For years, I’ve used an LMP dongle that offers 1Gbps. It was mildly unsatisfying because of the heat it generates, but it otherwise works.

Before ordering the 10Gbps plan I quickly checked if dongles for speeds over 1Gbps are common and how much they cost: it seemed like there were multiple options for 2.5 and 5Gbps at least, so I went ahead.

However, I had not counted on the finicky nature of (bad) network hardware drivers.

Long story short

So, I bought a new adapter. Tried it: it only reported 1Gbps. Figured it might be the cable. Tried out three different cables. Went for another dongle. Same result: failure.
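One thing worth checking before blaming the cable is what link speed the adapter actually negotiated. On macOS, `ifconfig` reports it on the `media:` line. Here is a minimal sketch of pulling the number out — the sample line below is an assumption of what that output looks like, and the interface name depends on your adapter (run `ifconfig` or `networksetup -listallhardwareports` to find yours):

```python
import re

# Hypothetical output of `ifconfig en7 | grep media` on macOS; `en7` and the
# exact line format are assumptions -- check your own machine's output.
sample = "media: autoselect (1000baseT <full-duplex>)"

# The negotiated speed is the number in front of "base"/"Base".
match = re.search(r"(\d+)base", sample, re.IGNORECASE)
speed_mbps = int(match.group(1))
print(speed_mbps)  # 1000 here means the link only negotiated 1 Gbps
```

A link that actually negotiated 2.5Gbps would show a `2500Base-T` media type instead of `1000baseT`.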

I then started googling, and found out:

The above was pretty disappointing, but there is a solution! Your easiest option is to find an adapter that:

You can skip the frustration and directly go for a Sonnettech adapter. Their Solo2.5G costs roughly $20 (cheaper than most others that won’t work) and you can find it on Amazon in the EU.

(I just saw their 10Gbps adapter is in stock again and I’m wildly tempted, but at $200+ I might need some more self-convincing. OWC seems to offer something similar, too, but do check proper support for your OS/hardware combination…)

Network of the Future

Now, you may wonder, why would I even want more than 1Gbps on my local machine? And why would I bother with a cable?

I remember a quote (probably from Joel Spolsky, but I can’t get my hands on it right now) that roughly went like:

To build the software of the future, you need the computer of the future.

If I recall correctly, he was advocating for employers to get (very) good hardware for their developers.

I’m starting to think that the same applies to bandwidth.

Make it go fast.

No. Faster.

One aspect of being productive as a developer (for me, anyway) is to stay in flow: the shorter the wait between trying something out in code and seeing the result, the better.

That’s why I strongly advocate for keeping build times short and choosing build systems for their speed (or at least spending some time optimising build and CI times).

Hardware producers have gifted us with ever faster machines: IDEs and compilers followed suit and got decent at leveraging them. As a result, it is now pretty common for the network to become the bottleneck when building software.1

After all, you need to download all these dependencies, and while most build systems are decent at caching them, other tools and technologies just love to eat your bandwidth: depending on what you work on, you may rely on Docker to build and run your code, and it’s not uncommon to end up with images of several hundred megabytes.
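Some back-of-the-envelope arithmetic makes the difference concrete. The 500 MB image, the 200 GB model file, and the 90% link-efficiency factor below are assumptions for illustration:

```python
def download_seconds(size_bytes, link_mbps, efficiency=0.9):
    """Rough time to fetch a file, assuming the link sustains `efficiency`
    of its nominal rate (protocol overhead, TCP ramp-up, and so on)."""
    return size_bytes * 8 / (link_mbps * 1e6 * efficiency)

image = 500e6  # a 500 MB Docker image
print(round(download_seconds(image, 1_000), 1))   # ~4.4 s at 1 Gbps
print(round(download_seconds(image, 10_000), 1))  # ~0.4 s at 10 Gbps

model = 200e9  # a 200 GB model download
print(round(download_seconds(model, 1_000) / 60))   # ~30 min at 1 Gbps
print(round(download_seconds(model, 10_000) / 60))  # ~3 min at 10 Gbps
```

The point isn’t the exact numbers — real links rarely sustain their nominal rate — but the order of magnitude: the upgrade turns a coffee-break download into a background blip.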

And if you’re into the latest LLM craze and considering trying out some models locally, you’ll learn about the pleasures of handling downloads of 200GB+ files.

To sum it up: we (at least I…) need the network a lot. Waiting on it takes time, lengthens my feedback loops and takes me out of flow.

Hence, I just want more of it2.

Why Wired

Following from the above, there are two key aspects I care about for my network connection:

Throughput, because I don’t want to wait too long when I’m deploying containers to test environments, downloading big files or re-building a Docker container that happens to require half of the universe;

Latency, because when I’m re-building something from within a Docker container that isn’t benefiting from any caching, higher latencies make downloading even small dependencies slow1.
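The latency point is easy to underestimate, so here is a crude worst-case model of fetching dependencies one by one — the request count, file sizes, and round-trip times are made-up illustrative numbers, and real package managers parallelise requests:

```python
def fetch_seconds(n_requests, rtt_ms, size_bytes_each, link_mbps):
    """Sequential fetches: each request pays one round trip plus transfer time."""
    round_trips = n_requests * rtt_ms / 1000
    transfer = n_requests * size_bytes_each * 8 / (link_mbps * 1e6)
    return round_trips + transfer

# 300 small dependencies of 50 kB each, on a 1 Gbps link:
print(round(fetch_seconds(300, 2, 50_000, 1_000), 2))   # wired, ~2 ms RTT
print(round(fetch_seconds(300, 30, 50_000, 1_000), 2))  # busy Wi-Fi, ~30 ms RTT
```

The transfer itself only accounts for ~0.12 s in both cases; at 30 ms per round trip, latency makes up nearly all of the nine-odd seconds in the second scenario. More bandwidth doesn’t help there — a lower-latency link does.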

Although Wi-Fi has got incredibly good over the years, for both of these the best guarantee remains a wire: Wi-Fi’s theoretical throughput is now above 1Gbps, but real-world performance is still very dependent on how many users are on the network and on your neighbours. Same goes for latency.

May your builds complete swiftly! (and correctly…)

For reference, here are some of the links I could salvage from last year’s browsing history:


  1. I realise it might not be that common. But it so happens that both projects I’m currently working on have a tendency to be network hungry for their builds, and while we did pick the proverbial low-hanging optimisation fruits, there is a longer tail of issues that are best pushed under the carpet with a good internet connection. I usually forget about those until I need to run a build while connected through my phone on the train. ↩︎

  2. Note that while I advocate for developers to have good hardware and connectivity, I’d still encourage anyone building tools to make them efficient in their use of resources: at some point faster machines and networks can’t hide away everything, and the assumption that everyone has a huge budget is wrong. Dan Luu questions the tendency to expect web users to have both powerful hardware and good connectivity, and the same point can be made about developers of more modest means being excluded from certain tech stacks due to more limited hardware. ↩︎