• 2 Posts
  • 89 Comments
Joined 1 year ago
Cake day: December 11th, 2024

  • In the last 5 years we have gated access to the internet behind bot checks and ad networks. Sites block the screen with ads or consent forms that aren’t even enforceable and make the user bark on command to get past.

    We have become trained seals. Captcha popped up? Better bark. Cloudflare challenge appeared? Better bark like the trained seal and click the checkbox.

    Nobody even talks about this. That these things are annoying is constantly discussed, but I have yet to see any article covering the fact that we have trained our species to bark on command without question, or what the ramifications are down the road.

    There isn’t even a planned offramp from this trajectory; these things are going to get more pervasive and annoying as technology improves. Where does this actually end?




  • We have an enormous problem with software optimization both in cycles and memory costs. I would love for that to change but the vast majority of customers don’t care. It’s painful to think about but most don’t care as long as it works “good enough” which is a nebulous measure that management can use to lie to shareholders.

    Even mentioning that we’ve wiped out roughly a decade in hardware gains with how bloated and slow our software is doesn’t move the needle. All of the younger devs in our teams truly see no issue. They consider nextjs apps to be instant. Their term, not me putting words in their mouths. VSCode is blazingly fast in their eyes.

    We’ve let the problem slide so long that we have a whole generation of upcoming devs that don’t even see a problem let alone care about it. Anyone who mentors devs should really hammer this home and maybe together we can all start shifting that apathy.




  • I think pretty much every dev understands the issue but they are limited in what they can do about it. Quitting a job because they won’t let you optimize is noble but unrealistic for the vast majority of devs.

    I would love for optimizations to start being prioritized. More specifically, I would love to see vendors place limits on memory use in apps. For example, Steam could reject any game over 50gb. I do not believe for a moment that any game we currently have needs more than 50gb except maybe an mmo with 20 years of content. Or Microsoft could reject apps that use more than X ram. They won’t ever do that but without an outright rejection, this won’t be fixed.


  • Unless you want to get fancy for the sake of being fancy, you will likely be best off just sticking with Kate.

    Basic editing can be done in vi or nano, or even piped to a file via the shell. I don’t think any of those are necessarily better or worse than using Kate. vi and nano would probably be faster, but you would need to be in a terminal already.

    That said, I am curious as well if anyone has a better answer.
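    To make the shell route above concrete, here are a few non-interactive ways to get text into a file from a terminal (the file names are made up for illustration):

```shell
# Append a line to a file without opening an editor at all
echo "remember to update the config" >> notes.txt

# Overwrite a file in one shot from a heredoc
cat > todo.txt <<'EOF'
buy milk
file taxes
EOF

# For interactive edits you would run: nano notes.txt  (or: vi notes.txt)
```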



  • Holy shit, ok I’ll state it yet again and then I’m done. For the 3rd time, this isn’t about who can afford what drives or who has what drives or what drives exist in our universe. Pretend drives don’t exist if that is easier, because the drives don’t matter. Drive space is a symptom of the underlying issue.

    This is about the near universal trend of software companies destroying a decade plus of hardware performance gains because they refuse to properly optimize their software. Full stop. Anything else is a side effect of not properly optimizing things. The drive type arguments, drive space arguments…they disappear once the fundamental issue (optimization) is addressed.

    Holding these companies accountable is how this gets fixed. It’s how this particular instance got fixed. This thread wouldn’t even exist if these weren’t legitimate complaints because the devs wouldn’t have bothered with this round of size reduction if there wasn’t a problem affecting their bottom line.


  • They need 15% of that space.

    It isn’t about the drives or if people can afford a certain size SSD (although that matters more than you are admitting). It’s about asking better of people and companies responsible for destroying over a decade of hardware advancement due to bad optimization (to save money), overly-abstracted frameworks (to save money), and improper handling of assets (to save money).

    They made a mistake in not prioritizing this, they fixed it and admitted there was a problem. That’s good. What isn’t good is people polishing their knobs as if the devs did this out of the goodness of their heart. They didn’t, they were losing players because people were speaking with time and money. Just like you wanted. Just like people in this thread are doing.

    But glancing through your other replies I’ll just stop here. May your drive space remain vast, and your tolerance for badly-optimized software remain stronger than mine.


  • No, the issue isn’t with the user having a 400GB drive. The issue is the devs chose to leave 131GB of unnecessary duplication in the game artifacts when published. That is a fundamental problem with software in general, but games especially.

    Blaming the customer for expecting to have a decent product is laughably misplaced. Part of software being a decent product is it being optimized. This was an absolute failure and they should really be putting out an apology instead of patting themselves on the back.

    It’s great the game is so much smaller now but it should have been this new size at launch. Certainly not 131GB bigger than it needed to be.


  • What’s funny is if you added another “level” to this going back another 15 years there would be someone complaining about the same things but with Java as the target. “Java is slow” wasn’t just a joke for no reason after all.

    There are some funny parts in the post as well as some true statements about the current state of things. We’ll see another post like it in 10-15 years and it will be a chuckle. Then we’ll all continue as we always have and deal with whatever comes down the pipe next.

    It’s what humans do and it isn’t restricted to technologies or programming languages.




  • This is a true statement but it also doesn’t accomplish anything. As much as you want others to care about something, they have quite literally a whole world of stuff going on in their head and these things are not necessarily priorities. They should be, but they aren’t.

    We need to keep in mind we are in an echo chamber and as important as these things are for us, we are in the minority. It isn’t because people don’t care, they are just busy with their own gremlins.

    It’s a problem but also very human.


  • Can confirm, there is a world of difference between people who are chatting about switching to Linux and the average computer user.

    As much improved as it is, Linux isn’t ready for those people. Not because it is hard or they can’t figure it out, but because they don’t care or don’t have the energy. Most people don’t even know what Linux is other than a term they might have heard a couple of times.

    I would love for Linux to take off and Microsoft to feel the sting from abusing their customers.



  • You know how with libraries you go, find a book, check it out, and take it home? After some time you have to return it, and then you can either get another book or renew your checkout and keep the book a bit longer. This is what is happening when people are registering domains. It’s also what’s happening when an organization applies to own a new top level domain.

    Let me explain. There are two different things this question could be in reference to: registering a domain and creating an entirely new top level domain (like .com, .net, .edu). Let’s start with creating an entirely new top level domain:

    So the tip top level is run by ICANN, which maintains the “golden” authoritative list of who owns which domain and top level domain. This organization will only point traffic to you if the top level domain is registered with them. This is very expensive and time consuming, and there is a thorough vetting process before you are approved. If you are approved, they will point traffic to you and you can then point the traffic to the appropriate domain. To maintain ownership of the top level domain you need to keep the registry up to date, including paying fees and maintaining certain standards and paperwork.

    The next level is a company that sits between the owner of the top level domain and the average person who wants to register a domain. GoDaddy is an example of a company like this. They work with the owners to hand out domains for a fee. This leads to the next level of your question:

    Much like the tip top level will register top level domains to organizations/businesses/etc., those same organizations/businesses/etc. can then turn around and sell any combination of characters before the top level domain. For example, if you owned .mybutt and it was approved and active, then anyone who wanted a domain that ended in .mybutt would need to be approved by you. Registering a domain at this level is generally pretty cheap compared to top level domains, and most people pay just a few bucks for them (with some exceptions).

    You are the library in this scenario and the books are the domains. You can check out domains to people but they have to bring them back at some point or keep paying.

    You in turn go to ICANN (a higher level library) to checkout a top level domain that you can then control.
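    The two levels above can be sketched as a toy data model. Everything here (the operator names, the .mybutt TLD, the function) is made up purely for illustration; real registry and registrar software is nothing like this simple:

```python
# Toy model of the hierarchy described above:
# ICANN's root list maps TLDs to the operators that run them,
# and each TLD operator keeps its own list of registered domains.

root_registry = {            # ICANN's "golden" list: TLD -> operator (hypothetical)
    "com": "Verisign",
    "mybutt": "MyButt LLC",  # pretend this TLD was approved
}

tld_registrations = {        # each operator's list: domain -> registrant
    "mybutt": {},
}

def register_domain(name: str, tld: str, registrant: str) -> str:
    """'Check out' a domain, like borrowing a library book."""
    if tld not in root_registry:
        raise ValueError(f".{tld} is not a delegated TLD")
    taken = tld_registrations.setdefault(tld, {})
    if name in taken:
        raise ValueError(f"{name}.{tld} is already registered")
    taken[name] = registrant
    return f"{name}.{tld}"

print(register_domain("example", "mybutt", "alice"))  # example.mybutt
```

    The "return or renew" part of the analogy would be an expiry date on each registration, omitted here to keep the sketch short.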