The solution was to unzip the Steam Deck .bz2 first. Balena Etcher recognizes the .bz2 file as a disk image in the file selector, but it doesn’t surface any errors outside the console; it just appears stuck at 0%.
Google, Ookla, and other speed-test providers are known to be prioritized by ISPs.
They show you fun numbers! See, it says 600 Mbps up! Cool! Most people will think “Okay, I’m getting 600 Mbps up!” and feel good about their service.
But the real upload rate is different. Obviously it’s in their business interest to throttle and such; there’s that whole FCC Net Neutrality rules (2017) Ajit Pai debacle.
Anyway, the focus of this post is basically, under the current scheme we’re in, trying to calculate the *actual* download/upload rates as a “layman.”
At least as of Ventura 13.5, there’s a built-in terminal command for this: `networkQuality`.
You’ll notice a difference compared to public test sites, to which ISPs are known to prioritize traffic to inflate their apparent upload capacity for average consumers googling “speed test”.
On my Google/Ookla tests, I get 600 Mbps up.
Someone had a trick: ping speedtest.net to make the ISP think you’re running a speed test. Pinging speedtest.net while running a local check does seem to change the results.
Router speed test
The router’s own speed test measures both up and down. This is incredible.
But we’re back to “ISP prioritization”: it’s likely they have prioritized that traffic too.
So how do we really find out?
Why don’t we just upload a couple gigs and see how long that takes?
I just uploaded a 2 GB file to Dropbox; it took about 20 seconds.
2,000 MB / 20 s = 100 MB/s
Converted to megabits: 100 MB/s × 8 = 800 Mbps
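The same napkin math as a tiny shell sketch (the size and timing are just the observed numbers from this run):

```shell
#!/bin/sh
# Napkin math: ~2,000 MB uploaded to Dropbox in ~20 seconds (observed).
size_mb=2000                             # file size in megabytes
seconds=20                               # observed upload time
mbytes_per_s=$(( size_mb / seconds ))    # 2000 / 20 = 100 MB/s
mbits_per_s=$(( mbytes_per_s * 8 ))      # 1 megabyte = 8 megabits
echo "${mbytes_per_s} MB/s = ${mbits_per_s} Mbps"   # prints: 100 MB/s = 800 Mbps
```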
So it looks like uploads to Dropbox are running at the intended speed.
Okay that napkin math checks out.
So despite the local network test showing 60 Mbps, napkin math says my upload to Dropbox ran at 800 Mbps.
Summary?
Clearly ISPs want to prioritize tests, but it’s not necessarily nefarious. Various local tools can show low uplink because the ISP isn’t prioritizing that traffic. It makes sense: they need to distribute bandwidth in the most efficient way. Poking holes and pointing fingers isn’t really productive.
I’m glad I did a manual upload; a “real 2 GB load” was prioritized and ran at ~800 Mbps (on Wi-Fi, mind you).
That tells me I’m getting the speeds when I need them, and I’ll just have to trust that my internet overlords have our best interests at heart. To be honest, I was surprised that when I uploaded something I actually needed to, versus a synthetic test, it DID perform 10x.
This one was tough to debug. I upgraded Turborepo to the latest version and also added the new “globalDependencies” key, which feeds into the cache hash for all pipelines.
The problem was that I had accidentally placed “globalDependencies” inside of “pipeline” and somehow just missed it. A better error message would have helped; the error is so generic that people hit it everywhere, so there’s a low signal-to-noise ratio.
I adopted Turborepo super early, so a lot of my setup is old. I added the JSON schema:
{"$schema": "https://turbo.build/schema.json"}, and the linter immediately called out that values in the pipeline object should be objects, which finally drew my attention to the fact that globalDependencies was not at the root.
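For illustration, here’s a minimal turbo.json with the key in the right place. The specific globs and task names below are placeholders, not my actual config:

```json
{
  "$schema": "https://turbo.build/schema.json",
  "globalDependencies": [".env"],
  "pipeline": {
    "build": {
      "dependsOn": ["^build"],
      "outputs": ["dist/**"]
    }
  }
}
```

With the schema in place, putting globalDependencies inside "pipeline" gets flagged immediately instead of failing with a generic error.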
Lesson of the day? Add your schema, get some free lint.
If you get this error, it may be because you imported `getIronSession` without using it. There seems to be some magic that validates whether you’re using it correctly, even when you’re not using it at all and are using the `withIronSessionApiRoute` API wrapper instead.
I think it’s odd that this isn’t obvious, but the only way I have found to stop a project connected to a repo from auto-building on every commit is to go into the “Ignored Build Step” setting and enter a script that exits with code 0.
Ignored Build Step: `exit 0;`
That’s it!
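For reference, here’s a hedged sketch of a slightly smarter ignore script. Vercel exposes `VERCEL_GIT_COMMIT_REF` (the branch being deployed) to this script; exiting 1 proceeds with the build and exiting 0 skips it. The branch name `main` below is an assumption:

```shell
#!/bin/bash
# Vercel "Ignored Build Step" sketch.
# exit 1 => proceed with the build; exit 0 => skip it.
# VERCEL_GIT_COMMIT_REF is set by Vercel to the branch being deployed.
if [ "$VERCEL_GIT_COMMIT_REF" = "main" ]; then
  exit 1   # build commits on main (hypothetical branch name)
else
  exit 0   # skip everything else
fi
```

The always-skip version from above is just this script reduced to a single `exit 0`.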
This needs to be added as a feature, because some of Vercel’s features are project-wide and can’t be tested separately from your main project.
For example, enabling Vercel Deployment Protection (which requires a login/password, etc.) applies to the whole project, meaning your CI builds will fail while you try to update each one with a bypass method.
Therefore, the only way to have a live site and a test site to test/work on CI bypasses is to create a second project connected to the same repo.
But what if you don’t want to double your builds for a moment? Then see above…
You can also disconnect the repo, but you will lose branch-specific env vars.
I suddenly had to agree to the Xcode terms to use Git, and it kept repeating “installing updates” over and over.
The answer was to fire up Xcode, let it finish installing the updates (likely related to the iPhone launch), and done. Ignore the update warning; just open Xcode.
My Studio Display has been freezing on me ever since I got it, no matter what firmware. I accidentally found the cause.
I’ve bought no fewer than 10 Logitech G305 Lightspeed mice. It’s as fast as wired, but wireless. I’ve had many different G305 dongles plugged into my monitor, because it’s my go-to mouse.
I bought a wired one so I have a backup for when I lose the dongle (which I do frequently, hence the 10 mice), and I haven’t had a freeze in days.
If you have a high-speed mouse dongle plugged into your Studio Display and it freezes about once a day… try a wired one.