Hard to believe it’s been 24 years since Y2K (2000). It feels like we’ve come such a long way, but this decade started off very poorly with one of the worst pandemics the modern world has ever seen, and technology in general is looking very bleak in several ways.

I’m a PC gamer, and it looks like things are stagnating massively in our space. So many gaming companies are incapable of putting out a successful AAA title, because people are either too poor or just don’t want to play yet another live-service AAA disaster like nearly every one released lately; Call of Duty, Battlefield, and almost anything Electronic Arts or Ubisoft puts out is a failure or undersells. So many gaming studios have been shuttered or are being shuttered, and Microsoft is basically one member of an oligopoly with Sony and a couple of other companies.

Hardware is stagnating. Nvidia is putting on the brakes for its next line of GPUs; we’re not going to see huge gains in performance anymore because AMD hasn’t caught up yet, so Nvidia has no reason to innovate. They’re just going to sell the top cards of their next line for $1,500 a pop, with a 10% increase in performance rather than the 50 or 60% we really need. We still don’t have the capability to play games in full native 4K at 144 Hz. That’s at least a decade away.

Virtual reality is on the verge of collapse because Meta is basically the only real player in that space; between them and the Valve Index they have a near-monopoly. Pico from China is on the verge of developing something incredible as well, and Apple just revealed a mixed-reality headset, but the price is so extraordinary that barely anyone has one, so use isn’t very widespread. We’re again a decade away from seeing anything really substantial in terms of performance.

Artificial intelligence is really, really fucking things up in general, and the discussions about AI look almost as bad as the news about the latest election in the USA. It’s so clowny and ridiculous and over-the-top hearing any news about AI. The latest news is that OpenAI is going to go from a non-profit to a for-profit company, after they promised they were operating for the good of humanity and broke countless laws stealing copyrighted information, supposedly for the public good. Now they’re just going to snap their fingers and morph into a for-profit company. So they can basically steal anything that’s copyrighted, claim it’s for the public good, and then randomly swap to a for-profit model. It doesn’t make any sense, and it just looks like they’re going to be a vessel for widespread economic poverty…

It just seems like there are a lot of bubbles about to burst all at the same time, and I don’t see how things are going to get better for a while now.

  • Telorand@reddthat.com

    I’m a PC gamer, and it looks like things are stagnating massively in our space.

    I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.

    Overall, I don’t see things the way you see them. I recommend taking a break from social media: go for a walk, play games you like, and fuck the trajectory of tech companies.

    Live your life, and take a break from the doomsaying.

    • Lvxferre@mander.xyz

      I would like to introduce you to the indie game scene. Where AAA is faltering, indie has never been in a better place.

      Amen.

      Indie games might not be flashy, but they’re often made with love and a real concern for giving you a fun experience. They also lack the abusive DRM and intrusive anti-cheat systems that A³ games often have.

      • Rob Bos@lemmy.ca

        They also tend to have Linux support. Where the AAA companies want to eat the entire mammoth and scorn the scraps, small companies can thrive off of small prey and the offal. :)

          • Lvxferre@mander.xyz

            It’s a great analogy though - Linux users aren’t deemed profitable by the A³ companies, just like offal is unjustly* deemed yucky by your typical person.

            *I do love offal though. And writing this comment made me crave chicken livers with garlic and rosemary over sourdough bread. Damn.

            • sugar_in_your_tea@sh.itjust.works

              Idk, I’ve spent way more on games since Valve came to Linux. I was a Linux user first, and mostly played games on console because I didn’t like rebooting into Windows or fiddling w/ WINE, so if I played a PC game, it was because it had Linux support (got a ton through Humble Bundle when they were small and scrappy). When Steam came to Linux, I created an account (didn’t have one before) and bought a bunch of games. I bought Rocket League when the Steam Controller and Steam Deck launched (it was part of a bundle), and when Proton launched, I bought a ton of Windows games.

              So at least for me, I’ve easily spent 100x what I would’ve spent on video games due to Steam supporting Linux. That said, there are easily 50 other people spending more than me on Windows for every one of me, so I get that Linux isn’t a huge target market. But I will spend more on an indie game if it has native Linux support.

      • Telorand@reddthat.com

        And I’ll add on to that, even if every GPU company stops innovating, we’ll still have older cards and hardware to choose from, and the games industry isn’t going to target hardware nobody is buying (effectively pricing themselves out of the market). Indie devs especially tend to have lower hardware requirements for their games, so it’s not like anyone will run out of games to play.

    • dinckel@lemmy.world

      Genuinely wish more people understood this. I’ve mostly only been playing indie games for the past few years, and it’s by far the best fun I’ve had in gaming. There are a ton of unbelievably creative, unique games out there. Not to mention that 99% of them are a single-purchase experience instead of a cash treadmill.

    • GBU_28@lemm.ee

      Hello indie gamer, it’s me, you, from the future.

      I’d like to introduce you to PATIENT indie gaming.

      The only games I play are from small teams: long-running, well documented, with passionate developers, existing mods, and the ability to run on a potato or a Steam Deck, etc.

      Because I’m patient, I don’t ever get preorder, Kickstarter, prealpha disappointed.

      I know exactly what I’m getting, I pay once, and boom, I own a great game forever. (You can more often fully DL indie games.)

        • GBU_28@lemm.ee

          Bro, I’m from the future, you can’t ask me stuff like that. Be patient, you’ll figure it out.

    • EnderMB@lemmy.world

      My only fear with the indie gaming industry is that many studios are starting to embrace the churn culture that has led AAA gaming down a dark path.

      I would love an app like Blind that allows developers on a game to anonymously call out the grinding culture of game development, alongside practices like firing before launch and removing workers from the credits. Review games solely on how the devs treated the workers, and we might see some cool correlations between good games and good culture.

      • Telorand@reddthat.com

        There’s certainly room to grow with regard to workers’ rights. I think you could probably solve at least a few of them if they were covered by a union, and publishers who hire them would have to bargain for good development contract terms.

    • RubberDuck@lemmy.world

      Plenty of good games out there; even in early access I have found some real gems. Just recently Coffee Stain released Satisfactory… a labor of love, and it shows. I recently tried Bellwright, and it’s impressive; so is Manor Lords.

      And hardware stagnating also means that people get to learn what it’s all about and optimize for it. The last gen games on a console are usually also better optimized than the first series of games on a platform. So yeah…

    • scarabic@lemmy.world

      Gaming now is more amazing than ever, in part because we have access to classic games too. If someone thinks gaming was amazing 10 years ago, cool. We still have those games! I’m playing a really old game right now myself and loving it.

      I think OP confuses this whole bubble-bursting thing. When a phenomenon passes out of its early explosive growth phase and settles into more of a steady state, that’s not the “bubble bursting”, that’s maturity.

      Tech as a whole is now a more mature industry. Companies are expected to make money, not revolutionize the world. OP would have us believe this means that tech is over. How does the saying go? It’s not the beginning of the end, but it is perhaps the end of the beginning.

      • frezik@midwest.social

        Companies are expected to make money, not revolutionize the world

        I’d like to believe that, but I don’t think investors have caught on yet. That’s where the day of reckoning will come.

        AI is a field that’s gone through boom and bust cycles before. The 1960s were a boom era for the field, and it largely came from DoD money via DARPA. This was awkward for a lot of the university pre and post grads in AI at the time, as they were often part of the anti-war movement. Then the anti-war movement starts to win and the public turns against the Vietnam war. This, in turn, causes that DARPA money to dry up, and it’s not replaced with anything from elsewhere in the government. This leads to an AI winter.

        Just to be clear, I like AI as a field of research. I don’t at all like what capitalism is doing with it. But what did we get from that time of huge AI investment? Some things that can be traced directly back to it are optimizing compilers, virtual memory, Unix, and virtual environments. Computing today would look entirely different without it. We may have eventually invented those things otherwise, but it would have taken much, much longer.

      • sugar_in_your_tea@sh.itjust.works

        I’m playing a really old game right now myself and loving it.

        Same. I’m slowly working my way through the Yakuza series (started w/ Yakuza 0), and I’m currently halfway through Yakuza 3, which was released in 2010. I play them about a year or two apart because I get kinda burned out near the end.

        I have way more games than I can reasonably play, and my wishlist of games I want to play is still unreasonably big. There’s no way I’m running out of interesting games to play anytime soon. And I haven’t really gotten into emulation either, so these are purely PC titles that I’m still trying to catch up on.

        Companies are expected to make money, not revolutionize the world

        Exactly. There’s a clear reason why Warren Buffett still owns a massive stake in Coca-Cola, and it’s not because they’re a hot young startup. Tech hardware is fantastic, and honestly, most people really don’t need big improvements year over year. I think game devs can do a lot more with the hardware we already have, so we should be looking at refining the HW we have (small improvements in performance, larger improvements in power efficiency and reduction in die size to improve margins). Likewise for desktop and cloud software, a round of optimizations would probably yield better gains than hardware revisions.

        I’m excited to see VR headsets get cheaper and more ubiquitous (i.e. I think something like the Valve Index could be done for half the price), handheld PCs like Steam Deck getting better battery life, etc.

    • Shadywack@lemmy.world

      I love this, and I’ll even one-up it. Let the bubbles burst; this is just a transitional period that you see like a predictable cycle in tech. The dot-com bust was like a holocaust compared to this shit. Everyone who was in the tech scene before Google has an easier time with this. We can comfortably watch FAANG recede, and even be grateful for it. Let it happen.

  • frezik@midwest.social

    . . . with 10% increase in performance rather than 50 or 60% like we really need

    Why is this a need? The constant push for better and better has not been healthy for humanity or the planet. Exponential growth was always going to hit a ceiling. The limit on Moore’s Law has been more to the economic side than actually packing transistors in.

    We still don’t have the capability to play games in full native 4K 144 Hertz. That’s at least a decade away

    Sure you can, today, and this is why:

    So many gaming companies are incapable of putting out a successful AAA title because . . .

    Regardless of the reasons, the AAA space is going to have to pull back. Which is perfectly fine by me, because their games are trash. Even the good ones are often filled with micro transaction nonsense. None of them have innovated anything in years; that’s all been done at the indie level. Which is where the real party is at.

    Would it be so bad if graphics were locked at the PS4 level? Comparable hardware can run some incredible games from 50 years of development. We’re not even close to innovating new types of games that can run on that. Planet X2 is a recent RTS game that runs on a Commodore 64. The genre didn’t really exist at the time, and the control scheme is a bit wonky, but it’s playable. If you can essentially backport a genre to the C64, what could we do with PS4 level hardware that we just haven’t thought of yet?

    Yeah, there will be worse graphics because of this. Meh. You’ll have native 4K/144Hz just by nature of pulling back on pushing GPUs. Even big games like Rocket League, LoL, and CS:GO have been doing this by not pushing graphics as far as they can go. Those games all look fine for what they’re trying to do.

    I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.

    • sugar_in_your_tea@sh.itjust.works

      None of them have innovated anything in years

      Well, they’ve innovated new ways to take up disk space…

      There’s a reason I don’t play new-release AAA games, and it’s because they’re simply not worth the price. They’re buggy at launch, take up tons of disk space (with lots of updates the first few months), and honestly aren’t even that fun once the bugs are fixed. Indie games, on the other hand, seem to release in a better state, tend to be fairly small, and usually add something innovative to the gameplay.

      The only reason to go AAA IMO is for fancy graphics (I honestly don’t care) and big media franchises (i.e. if you want Spiderman, you have to go to the license holder), and for me, those lose their novelty pretty quickly. The only games I buy near release time anymore are Nintendo titles and indie games from devs I like. AAA just isn’t worth thinking about, except the one or two each year that are actually decent (i.e. Baldur’s Gate 3).

    • Dkarma@lemmy.world

      This post really nails my take on the issue. Give me original CS-level graphics or even AQ2 graphics, a decent story, more levels, and a few new little gimmicks (Rocket Arena grappling hook, anyone?!?!) and you don’t need 4K blah blah bullshit.

      The #1 game for kids is literally Minecraft or Roblox… 8-bit-level graphics outselling your hi-res horse-armor bullshit.

      The last game I bought was two days ago: MoHAA Airborne for PC, $5 at a pawn shop. Give me 100 games of that quality instead of anything the PS5 ever made.

      • solomon42069@lemmy.world

        Here are the number of hours I’ve spent on indie games VS AAA titles, according to my Steam library:

        • Indie - Valheim - 435 hours
        • Indie - Space Haven - 332 hours
        • Indie - Satisfactory - 215 hours
        • Indie - Dyson Sphere Program - 203 hours
        • AAA - Skyrim - 98 hours
        • AAA - Control - 47 hours
        • AAA - Far Cry 6 - 29 hours
        • AAA - Max Payne 3 - 43 minutes

        If we’re talking about value - the amount of playtime I’ve gotten out of games with simpler graphics and unique ideas blows the billions spent by the industry out of the water.

        • sugar_in_your_tea@sh.itjust.works

          Depending on where you draw the line, mine looks similar:

          1. EU4 - >800 hours
          2. Cities Skylines - ~180 hours
          3. Magic: Arena - >100 hours
          4. Crusader Kings 2 - ~100 hours

          After that it depends on the length of the game. I normally just play through the campaign on most games once (except the above, which have lots of replay value), so looking at playtime isn’t particularly interesting IMO. The ratio of games with interesting playtime (i.e. I probably rolled credits) between indie and AAA is easily 2:1, if not something way higher like 5:1 or even 10:1, but again, that really depends on where you draw the line. If we look at 100% completion, I have 22 indie games and zero AAA games, because I rarely find AAA games to be worth going after achievements in. If I sort by achievement completion, the top two AAA games are Yakuza games (I love that series), and that’s after scrolling through dozens of indies, many of which have a fair amount of achievements (i.e. you need to do more than just roll credits).

          So yeah, AAA games really don’t interest me. If you compare the amount I’ve spent on indie vs AAA games, it would be a huge difference since I pretty much only play older AAA games if I get them on sale, and that’s mostly so I can talk about them w/ friends…

    • barsoap@lemm.ee

      The limit on Moore’s Law has been more to the economic side than actually packing transistors in.

      The reason those economic limits exist is that we’re reaching the limit of what’s physically possible. Fabs are still squeezing more transistors into less space, for now, but the cost per transistor hasn’t fallen for some time; IIRC 10nm or thereabouts is still the most economical node. Things just get difficult and exponentially fickle the smaller you get, and at some point there’s going to be a wall. Of note, currently we’re talking more about things like backside power delivery than actually shrinking anything. Die-on-die packaging and stuff.

      Long story short: node shrinks aren’t the low-hanging fruit any more. They haven’t been since the end of planar transistors (if it had been possible to just shrink back then, they wouldn’t have engineered FinFETs), but it’s really been picking up speed with the start of the EUV era. Finer and finer pitches don’t really matter if you have to have more and more lithography/etching/coating steps because the structures you’re building are getting more and more involved in the z axis; every additional step costs additional machine time. On the upside, newer production lines can spit out older nodes at pretty much printing-press speed.

    • gandalf_der_12te@lemmy.blahaj.zone

      I want smaller games with worse graphics made by people who are paid more to work less, and I’m not kidding.

      I agree. Wholeheartedly. I think it’s just so obvious how quality dramatically takes off when the people creating it feel safe, sound, and economically stable. Financial Security (UBI) drives creativity probably more than anything else. It’s a huge win!

  • BananaTrifleViolin@lemmy.world

    As others have said, gaming is doing fine - AAA and bloated incumbents are not doing well, but the indie sector is thriving.

    VR is not on the verge of collapse, but it is growing slowly, as we still have not reached the right price point for a mobile, high-powered headset. Apple made a big play for the future of VR with its Apple Vision Pro, but that was not a short-term play; it was laying the groundwork for trying to control or shape a market that is still probably at least 5 if not 10 years away from something that will provide high-quality VR untethered from a PC.

    AI meanwhile is a bubble. We are not in an age of AI, we are in an age of algorithms - they are and will be useful, but will not meet the hype or hyperbole being bandied about. Expect that market to pop, probably with spectacular damage to some companies.

    Other computing hardware is not really stagnating - we are going through a generational transition period. AMD is pushing Zen 5, Intel its 14th gen, and all the chip makers are desperately trying to get on the AI bandwagon. People are not upgrading because they don’t see the need - there aren’t compelling software reasons to upgrade yet (AI is certainly not compelling consumers to buy new systems). They will emerge eventually.

    The lack of any landmark PC AAA games is likely holding back demand for consumer graphics cards, and we’re seeing similar issues with consoles. The games industry has certainly been here many times before. There is no Cyberpunk 2077 coming up - instead we’ve had flops like Star Wars Outlaws, or underperformers like Starfield. But look at the biggest game of last year: Baldur’s Gate 3 came from a small studio and was a megahit.

    I don’t see doom and gloom, just the usual ups and downs of the tech industry. We happen to be in a transition period, and also being distracted by the AI bubble and people realising it is a crock of shit. But technology continues to progress.

    • sugar_in_your_tea@sh.itjust.works

      VR

      Yeah, I think it’s ripe for an explosion, provided it gets more accessible. Right now, your options are:

      • pay out the nose for a great experience
      • buy into Meta’s ecosystem for a mediocre experience

      I’m unwilling to do either, so I’m sitting on the sidelines. If I can get a headset for <$500 that works well on my platform (Linux), I’ll get VR. In fact, I might buy 4 so I can play with my SO and kids. However, I’m not going to spend $2k just for myself. I’m guessing a lot of other people are the same way. If Microsoft or Sony makes VR accessible for console, we’ll probably see more interest on PC as well.

      People are not upgrading because they don’t see the need

      Exactly. I have a Ryzen 5600 and an RX 6650, and it basically plays anything I want to play. I also have a Steam Deck, and that’s still doing a great job. Yeah, I could upgrade things and get a little better everything, but I can play basically everything I care about (hint: not many recent AAA games in there) on reasonable settings on my 1440p display. My SO has basically the same setup, but with an RX 6700 XT.

      I’ll upgrade when either the hardware fails or I want to play a game that needs better hardware. But I don’t see that happening until the next round of consoles comes out.

      • realitista@lemm.ee

        Yeah, Sony was my hope here, but despite a few great experiences, they have dropped the ball overall. I’m bored of the cartoony Quest stuff, so I’ll probably not buy another headset for a good 5-10 years, until there’s something with a good library and something equivalent to a high-end PC experience today.

        • sugar_in_your_tea@sh.itjust.works

          Yup, but with good headsets costing way more than good monitors and generally needing even better GPUs, I’m just not interested. Yeah, the immersion is cool, but at current prices and with the current selection of games, the value proposition just isn’t there. Add to that the bulk, and it’ll probably stay on my wishlist for a while (then again, the Bigscreen VR headset looks cool; I’d just need a way to swap pads so my SO/kids can try it).

          So yeah, maybe in 5-10 years it’ll make more sense. It could also happen sooner if consoles really got behind it, because they’re great at bringing down entry costs.

          • realitista@lemm.ee

            Unfortunately Sony was our last hope for consoles and they half-assed it. The very last hope is that Flat2VR ports tens of AAA titles in rapid succession to PS5.

  • madjo@feddit.nl

    We still don’t have the capability to play games in full native 4K 144 Hertz.

    And we really don’t need that. Gameplay is still more important than game resolution. Most gamers don’t even have hardware that would allow that type of resolution.

    • XIIIesq@lemmy.world

      I remember when running Counter-Strike at 30 fps on a 480p monitor meant you had a good computer.

      Modern graphics are amazing, but they’re simply not required to have a good gaming experience.

    • Buttflapper@lemmy.world (OP)

      Gameplay is still more important than game resolution

      In your opinion*. You forgot that part. For lots of people, graphics are way more important, because they want a beautiful and immersive experience, and they are not wrong to want that. I respect that you feel the way you do, but I also respect others who care more about graphics. I’ll even go so far as to say that I’m of the same mind as you; I don’t care about graphics much at all, but there are some games where the graphics or visual effects have truly wowed me. Two that come to mind are Ori and the Will of the Wisps and No Man’s Sky: very different games, but with absolutely crazy visual effects and graphics on a high-end computer. Another game I play a lot is World of Warcraft; the gameplay is so damn fun, but it’s hard to get any of my friends to play it because it’s so ugly it looks like a poorly rendered PS3 game. That horrible graphics quality stops people from even trying it.

      Most gamers don’t even have hardware that would allow that type of resolution.

      This is because they refuse to innovate. Think of the DVD player: you think a DVD player costs a lot today? Of course not; there are a million of them and no one wants them anymore. If they actually innovated and created drastic leaps in technology, then older technology would be cheaper. It’s not expensive to go out and get an RTX 2080, which is the graphics card I currently have; it’s about $250 or $300 now, a pretty damn solid card. If they actually innovated and kept pushing the limits, technology would advance faster. Instead they want the inverse of that: as slow a rate of progress as feasibly possible, the maximum amount of time to innovate, the maximum amount of revenue, and the maximum impact on the environment, with all those carbon emissions and graphics cards being thrown out.

      • HobbitFoot @thelemmy.club

        If graphics were worth it, people would pay for it.

        The fact of the matter is that exponential graphics capability requires an exponential input of developer and asset-creator budget. Given that there is a ceiling on game prices, it isn’t worth going for higher-fidelity games when the market isn’t going to pay for it.

      • madjo@feddit.nl

        You can have the most realistic graphics in the world, pushing your AMViditel RTX 5095Ti Plus Platinum Ultra with 64TB VRAM to its absolute maximum, but if the gameplay sucks, you won’t have as much fun as you would with a pixel-art indie game with lots of fun gameplay.

  • j4k3@lemmy.world

    Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

    Nvidia is just playing conservative because it was massively overvalued by the market. The GPU use for AI is a stopover hack until hardware can be developed from scratch. The real life cycle of hardware is 10 years from initial idea to first consumer availability. The issue with the CPU in AI is quite simple. It will be solved in a future iteration, and this means the GPU will get relegated back to graphics or it might even become redundant entirely. Once upon a time the CPU needed a math coprocessor to handle floating point precision. That experiment failed. It proved that a general monolithic solution is far more successful. No data center operator wants two types of processors for dedicated workloads when one type can accomplish nearly the same task. The CPU must be restructured for a wider bandwidth memory cache. This will likely require slower thread speeds overall, but it is the most likely solution in the long term. Solving this issue is likely to accompany more threading parallelism and therefore has the potential to render the GPU redundant in favor of a broader range of CPU scaling.

    Human persistence of vision is not capable of matching higher speeds that are ultimately only marketing. The hardware will likely never support this stuff because no billionaire is putting up the funding to back up the marketing with tangible hardware investments. … IMO.

    Neo-feudalism is well worth abandoning. Most of us are entirely uninterested in this business model. I have zero faith in the present market. I have AAA-capable hardware for AI. I play and mod open source games. I could easily be a customer in this space, but there are no game manufacturers. I do not make compromises in ownership. If I buy a product, my terms of purchase are full ownership with no strings attached whatsoever. I don’t care about what everyone else does. I am not for sale, and I will not sell myself for anyone’s legalese nonsense or pay ownership costs to rent from some neo-feudal overlord.

    • Chocrates@lemmy.world

      Mainstream is about to collapse. The exploitation nonsense is faltering. Open source is emerging as the only legitimate player.

      I’m a die-hard open source fan, but that still feels like a stretch. I remember 10 years ago we were theorizing that Windows would get out of the OS business and just be a shell over a Unix kernel, and that never made it anywhere.

      • Rob Bos@lemmy.ca

        I don’t think that is necessarily out of the running yet. OS development is expensive and low profit. Commodification may be inevitable. Control of the shell and GUI, where they can push advertisements and shovelware and telemetry on you, that is profitable.

        So in 20 years, 50? I predict proprietary OSes will die out eventually, balance of probability.

        • Chocrates@lemmy.world

          I’m with you in the long term.

          I am curious what kernel is backing the computers on the stuff SpaceX is doing. I’ve never seen their consoles, but I am guessing we are closer to modern reusable hardware and software than we were before. When niche applications like that keep getting more diverse, I bet we will get more open specifications so everything can work together.
          But again I am more pessimistic and think 50 years would be relatively early for something like that.

      • solomon42069@lemmy.world

        I think the games industry will start to use open source tools like Blender and Godot more and more. These options have really matured over the years and compete on features and productivity with commercial options.

        From a business POV - open source makes a lot of sense when you need a guarantee your investment won’t evaporate because a vendor has cancelled a feature or API your game uses. With open source, if you don’t like a path the upstream code is taking you can fork off and make your own!

        Part of the dynamic is also how people are inspired and learning skills. You can learn how to do very advanced stuff in Blender for free on Youtube - why would you pay some private college thousands of dollars to learn an expensive program like Maya to do the same thing?

      • cakeistheanswer@lemmy.dbzer0.com

        That’s probably closer today than it was then. The added complication being that the client is probably not thin enough for them to return to the mainframe model, which would be vastly easier to monetize.

        Besides, we got WSL out of the bargain, so at least interop isn’t a reverse-engineering job. It’s poetically the reason Linux ended up killing the last few Windows Server shops I knew. Why bother running Windows Server X just to run Apache under Linux? Why bother with Hyper-V when you can pull a whole Docker image?

        If the Fortune 500 execs are sold on Microsoft, it’s mostly as a complicated contractual absolution of cybersecurity blame.

      • rottingleaf@lemmy.world

        It remained in the OS business to the extent that is required for the malware business.

        Also, NT is not a bad OS (except for being closed, proprietary, and probably messy by now). The Windows subsystem on top of it would suck just as badly if it ran on something Unix-like.

        • Chocrates@lemmy.world

          Yeah, I guess in my fantasy I was assuming that Windows would do a full rewrite and adopt the Unix ABI, but I know that wouldn’t happen.

          • rottingleaf@lemmy.world

            They have a few legacy things working in their favor. Hardware compatibility is one, but that seems to be a thing of the past now that people don’t care. Application compatibility is another, and that is with Windows, not with NT.

            And they don’t have to change the core parts, because NT is fine. Windows is not, it’s a heap of legacy, but it’s not realistically replaceable.

            Unless they develop a new subsystem from scratch, like Embrasures or Walls or Bars, and gradually deprecate Windows. That doesn’t seem very realistic either, but if they were still a software company and not a malware company, they’d probably start doing this sometime about now.

    • tias@discuss.tchncs.de

      AI still needs a lot of parallelism but doesn’t have strict latency requirements. That makes it ideal for a large expansion card instead of putting it directly on the CPU die.

      • j4k3@lemmy.world

        Multithreading is parallelism and is poised to scale to a similar factor; the primary issue is simply getting tensors in and out of the ALU. Good enough is the engineering game. Having massive chunks of silicon lying around unused is a much more serious problem. At present, the choke point is not the parallelism of the math but the L2-to-L1 bus width and cycle timing. The ALU can handle the issue. The AVX instruction set is capable of loading 512-bit-wide words in a single instruction; the problem is just getting these in and out in larger volume.
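
        As a concrete aside, here is a minimal C sketch of that 512-bit-load point (illustrative only: it assumes an AVX-512-capable CPU and a build with -mavx512f, and the bandwidth problem described above is exactly about keeping loads like these fed):

        ```c
        /* Minimal sketch of the "512-bit word in one instruction" point above.
           Assumes an AVX-512 capable CPU and compilation with -mavx512f; in practice
           the hard part is keeping loads like these fed from the cache hierarchy. */
        #include <immintrin.h>
        #include <stdio.h>

        int main(void) {
            int a[16] = { 0, 1, 2, 3, 4, 5, 6, 7, 8, 9, 10, 11, 12, 13, 14, 15 };
            int b[16] = { 15, 14, 13, 12, 11, 10, 9, 8, 7, 6, 5, 4, 3, 2, 1, 0 };
            int c[16];

            __m512i va = _mm512_loadu_si512(a);      /* one instruction loads 512 bits (16 ints) */
            __m512i vb = _mm512_loadu_si512(b);
            __m512i vc = _mm512_add_epi32(va, vb);   /* 16 additions in one instruction */
            _mm512_storeu_si512(c, vc);

            printf("%d %d ... %d\n", c[0], c[1], c[15]);   /* every element is 15 */
            return 0;
        }
        ```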

        I speculate that the only reason this has not been done already is pretty much the marketability of single-thread speeds. Present thread speeds are insane and well into the radio realm of black-magic bearded-nude-virgins wizardry. I don’t think it is possible to make these bus widths wider and maintain the thread speeds because it has too many LCR consequences. I mean, at around 5 GHz the concept of wire connections and gaps as insulators is a fallacy, when capacitive coupling can make connections across small gaps.

        Personally, I think this is a problem that will take on a whole new architectural solution. It is anyone’s game unlike any other time since the late 1970’s. It will likely be the beginning of the real RISC-V age and the death of x86. We are presently at the age of the 20+ thread CPU. If a redesign can make a 50-500 logical core CPU slower for single thread speeds but capable of all workloads, I think it will dominate easily. Choosing the appropriate CPU model will become much more relevant.

    • sunzu2@thebrainbin.org

      I do not make compromises in ownership.

      preach!

      At the end of the day though, proper change will only come once a critical mass aligns on this issue, along with a few others.

      The political process is too captured for peasants to effect any change; we have more power voting with our money as customers, at least for now.

  • Blackmist@feddit.uk

    COVID also inflated a lot of tech stock massively, as everybody suddenly had to rely a lot more on it to get anything done, and the only thing you could do for entertainment was gaming, streaming movies, or industrial quantities of drugs.

    Then that ended, and they all wanted to hold onto that “value”.

    It is a bubble, but whether it pops massively like in 2000, or just evens off to the point where everything else catches up, remains to be seen.

    “The markets can remain irrational longer than you can remain solvent” are wise words for anyone thinking of shorting this kind of thing.

    • Buttflapper@lemmy.world (OP)

      Shows that you are in the UK. Just want to clarify I’m talking specifically about the USA, but I agree with everything you said. Tech stocks became so inflated! I don’t know if people are seeing it in Europe, but here in the USA there is this really toxic and very cringe push from these tech companies to get people back to the office. They can force people to return to office across the country, which basically means you have to relocate and upend your entire life, which could cost you $50,000, and they’re not paying for that; if you don’t do it, you get fired. It’s an easy way to start laying people off without having to pay them anything, because you can call it insubordination when they refuse to return to the office. Now they supposedly have cause to get rid of people or deny them promotions and raises. IBM, for example, is doing this right now, and Cisco, one of the biggest networking software companies in the market, was doing it as well. Scumbag behavior.

  • schizo@forum.uncomfortable.business

    Well, that’s the doomer take.

    The rumors are that the 80 series card is 10% faster than the 90 series card from last gen: that’s not a ‘10%’ improvement, assuming the prices are the same, that’s more like a 40% improvement. I think a LOT of people don’t realize how shitty the 4080 was compared to the 4090 and are vastly mis-valuing that rumor.
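
    To make that arithmetic concrete, here’s a rough sketch; the 4080-to-4090 gap used below is an assumed, illustrative figure, not a benchmark:

    ```c
    /* Illustrative only: treat the 4090 as the 1.00 baseline and assume the 4080
       lands around 0.78x of it; the real gap varies by game and resolution. */
    #include <stdio.h>

    int main(void) {
        const double perf_4090 = 1.00;
        const double perf_4080 = 0.78;              /* assumption for illustration */
        const double perf_5080 = 1.10 * perf_4090;  /* "10% faster than last gen's 90-class" */

        double uplift = (perf_5080 / perf_4080 - 1.0) * 100.0;
        printf("5080 vs 4080 uplift: ~%.0f%%\n", uplift);  /* ~41% with these assumed numbers */
        return 0;
    }
    ```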

    I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DONT BOTHER’ take is wrong. My gaming has moved almost entirely to my ROG Ally, and you know what? Shit is just as fun and way more convenient than the 7700X/3080 12GB desktop, even if it’s 1080p low and not 1440p120. If the only thing a game has going for it is ‘ooh, it’s pretty’, then it’s unlikely to be one of those games people care about in six months.

    And anyways, who gives a crap about AAAAAAAAAAAAA games? Indie games are rocking it in every genre you could care to mention, and the higher budget stuff like BG 3 is, well, probably the best RPG since FO:NV (fight me!).

    And yes, VR is in a shitty place because nobody gives a crap about it. I’ve got a Rift, Rift S, Quest, and a Quest 2, and you know what? It’s not interesting. It’s a fun toy, but it has zero sticking power, and that’s frankly due to two things:

    1. It’s not a social experience at all.
    2. There’s no budget for the kind of games that would drive adoption, because there’s no adoption to justify spending money on a VR version.

    If you could justify spending the kind of money that would lead to having a cool VR experience, then yeah, it might be more compelling but that’s been tried and nobody bought anything. Will say that Beat Saber is great, but one stellar experience will not sell anyone on anything.

    And AI is this year’s crypto which was last year’s whatever and it’s bubbles and VC scams all the way down and pretty much always has been. Tech hops from thing to thing that they go all in on because they can hype it and cash out. Good for them, and be skeptical of shit, but if it sticks it sticks, and if it doesn’t it doesn’t.

    • Trainguyrom@reddthat.com

      I’d also argue the ‘GAMES MUST BE ULTRA AT 4K144 OR DONT BOTHER’ take is wrong.

      Some of the best games I’ve played have graphics that’ll run on a midrange GPU from a decade ago, if not just integrated graphics

      Case in point, this is what I’m playing right now:

    • fuckwit_mcbumcrumble@lemmy.dbzer0.com

      The 5080 is rumored to be 10% faster, but also to use 90% of the power. While performance sees a normal generational leap, power consumption has gone up to match, leaving you with a much smaller actual improvement.

      • schizo@forum.uncomfortable.business

        Power consumption numbers like that are expected, though.

        One thing to keep in mind is how big the die is and how many transistors are in a GPU.

        As a direct-ish comparison, there’s about 25 billion transistors in a 14900k, and 76 billion in a 4090.

        Big die + lots and lots of transistors = bigly power usage.

        I wouldn’t imagine that the 5000-series GPUs are going to be smaller or have fewer transistors, so I’d expect this to be in the “die shrink lowers power usage, but more transistors increase power usage” zone.

        • fuckwit_mcbumcrumble@lemmy.dbzer0.com

          You can also get big power consumption from turning up the voltage and cranking the clock speeds well past their efficient zone. You see that right now with most 40 series cards where turning the clock speeds down a smidge gives you huge power savings at almost no loss in performance.

          Cost per mm² of die space has only gone up with each process node these last 10 years, so unless you’re paying big money, don’t expect a big chip.

        • Vik@lemmy.world

          Conversely, the Apple silicon products ship huge, expensive dies fabbed on leading TSMC processes which sip power relative to contemporaries. You can have excellent power efficiency on a large die at a specific frequency range, more so than a smaller die clocked more aggressively.

          • schizo@forum.uncomfortable.business

            You’re not wrong (and those are freaking enormous dies that have to cost apple a goddamn fortune to make at scale), but like, it also isn’t an Apples-to-Apples comparison.

            nVidia/Intel/AMD have gone for the maximum performance and fuck any heat/noise/power usage path. They haven’t given a shit about low-power optimizations or about investing in designs that are more suited to low-power implementations (an M3 Max will pull ~80 W if you flog the crap out of it, so let’s use that number). IMO the wrong choice, but I’m just a computer janitor that uses the things, I don’t design them.

            Apple picked a uarch that was already low power (fun fact: ARM was so low power that the first test chips would run off the board’s standby power and would boot BEFORE they were actually turned on) and then focused on making it as fast as possible with the least power possible: the compute cores came from the mobile side prior to being turned into desktop chips.

            I’m rambling, but: until nVidia and the x86 vendors prioritize power usage over raw performance (which they did with Zen 5, and you saw how that shit spiraled into a fucking PR mess), you’re going to get next year’s die shrink, but with more transistors using the same power with slightly better performance. It’s entirely down to design decisions, and frankly, x86 (and to some degree nVidia) has painted itself into a corner by relying on process node improvements (which are very rapidly going to stop happening) and modest IPC uplifts to stay ahead of everyone else.

            I’m hoping Qualcomm does a good job staying competitive with their ARM stuff, but it’s also Qualcomm and rooting for them feels like cheering on cancer.

            • Vik@lemmy.world

              This outlines several issues, a key one being outbidding Apple for wafer allocation on leading processes. They primarily sell such high-margin products that I suppose they can go full send on huge dies with no sweat. Similarly, the 4090’s asking price was likely directly related to its production cost. A chunky boy with a huge L2$.

              I like the way Mike Clark frames challenges in semi eng as a balancing act between area, power, freq and performance (IPC); like a chip that’s twice as fast but twice the size of its predecessor is not considered progress.

              I wish ultra-efficient giga dies were more feasible but it’s kind of rough when TSMC has been unmatched for so long. I gather Intel’s diverting focus in 18A, and I hope that turns out well for them.

              I’m not sure that arm as an ISA (or even RISC) is inherently more efficient than CISC today, particularly when we look at Qualcomm’s latest efforts in notebooks, more that Apple have extremely proficient designers and benefit from vertical integration.

    • astropenguin5@lemmy.world

      Little bit of pushback on the vr front: Sure, there aren’t many massive publishers driving it forward, but I would wholeheartedly argue that it can very much be a social experience, and offers experiences it is damn near impossible to get anywhere else, and three games immediately come to mind:

      VRchat (obviously): Literally entirely a social game, and has a pretty large community of people making things for it, from character models to worlds because that’s what drives the game. There is a massive scene of online parties, raves, hangouts, etc. that bring people together across the whole world in a medium more real than any flat game because of the custom models, worlds, and the relative abundance of people using full body tracking to show off, dance, and interact with each other.

      VTOL VR: This is still fairly social in that you can either play with friends or people online, but the main draw for me is the level of immersion in flying you can get. You have full interactable cockpits that you basically just use your real hands to interact with (depending on your controller/hand tracking) and it’s all pretty realistic. It’s just impossible to have the same level of experience without VR.

      Walkabout mini golf: I was pretty skeptical of this game when my friends wanted to play it, it’s literally just a mini golf sim. The thing is, the ability to play mini golf with friends who live across the country/world is amazing, and the physics of just swinging your controller/hands in the same way as real mini golf is so special.

      It is still quite expensive to get really good gear, and that is definitely the current biggest hurdle. It may forever be a smaller community due to the space/tech/cost requirements to make the experience truly incredible, but for me, even just on a Quest 2 in my room without a lot of fancy stuff, it is still interesting and something special. A lot of people really do care a lot about VR, and even if it is far less than conventional gaming, it should not be entirely discounted. And I personally think that while it probably won’t ever replace flat-screen gaming, it is an entirely different kind of experience and has at least a decent future ahead.

      • schizo@forum.uncomfortable.business

        Fair points on VR games being fairly social. I was more thinking of the in-person social experience, which still involves some portion of people sitting around stuffing their faces into a headset and wandering off into their own world.

        IMO, this is something that AR/MR stuff could do a great job of making more social by adding the game to the world, rather than taking the person out of the world to the game but, of course, this also restricts what kind of games you can do so is probably only a partial solution and/or improvement on the current state of affairs.

        I also agree that it’s way too expensive still, and probably always will be because the market is, as you mentioned, small.

        PCVR is pretty much dead despite its proponents running around declaring that it’s just fine like it’s a Monty Python skit. And the tech for truly untethered headsets is really only owned by a single (awful) company and only because the god-CEO thinks it’s a fun thing to dump money on which means it’s subject to sudden death if he retires/dies/is ousted/has to take time off to molt/has enough shareholder pressure put on him.

        Even then, it’s only on a second generation (the original Quest was… beta, at best) and is expensive enough that you have to really have a reason to be interested rather than it being something you could just add to your gaming options.

        I’d like VR to take off and the experiences to more resemble some of the sci-fi worlds that feature or take place in a virtual reality, but honestly, I’ve thought that would be cool for like 20 years now and we’re only very slightly closer than we were then; we just have smaller headsets and somewhat improved graphics.

    • Telorand@reddthat.com

      If only we had some way of working with a bigger integer…maybe we’d call it something like BigInteger…

      • bamboo@lemm.ee

        Or just a u64. 64 bit computers are pretty standard nowadays.

        • Telorand@reddthat.com

          I had heard that. Maybe I’ll get my hands on one someday. I hear Commodore makes one.

          (I do wonder now if whatever variable is being used to denote time is signed or unsigned, because that would make a big difference, too.)

          • bamboo@lemm.ee

            The C64 is 8 bit but has 64k of memory.

            While the specification allows time_t to be basically whatever, in practice it has traditionally been a signed 32-bit int, presumably signed so it can accommodate dates before the epoch on 1/1/1970.
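
            For illustration, here is a minimal sketch of what that signed 32-bit counter implies; it forces a 32-bit value explicitly (modern 64-bit platforms typically use a 64-bit time_t) and leans on the common wrap-around behaviour of out-of-range signed conversions:

            ```c
            /* Y2038 in miniature: a signed 32-bit seconds counter runs out in January 2038. */
            #include <stdio.h>
            #include <stdint.h>

            int main(void) {
                int32_t t = INT32_MAX;            /* 2,147,483,647 s after 1970-01-01 = 2038-01-19 03:14:07 UTC */
                int64_t next = (int64_t)t + 1;    /* do the +1 in 64 bits to avoid signed overflow */
                int32_t wrapped = (int32_t)next;  /* out-of-range conversion: wraps on common platforms */

                printf("last representable second: %d\n", t);
                printf("one second later:          %d\n", wrapped);  /* large negative, i.e. back in 1901 */
                return 0;
            }
            ```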

  • LordCrom@lemmy.world

    I would love to have a VR headset that didn’t require a damn account with a third party just to use it. I don’t need an account for my monitor or my mouse. Plus, when I bought the thing it was just Oculus; then Meta bought it and promised nothing would change, before requiring a Meta account to use the fucking thing.

    • Buttflapper@lemmy.world (OP)

      That unfortunately is the consequence of letting a company have a monopoly. The US govt should’ve opposed that and should’ve forced them to sell it off. They own such a huge share of the entire VR market right now it’s unbelievable, and Pico by ByteDance isn’t legally able to be sold in the USA.

    • capital@lemmy.world

      If I get back into it, I’ll probably try out Bigscreen. I haven’t dug deep enough into it to know if it requires an account but I wouldn’t expect this one to require it.

  • magic_lobster_party@fedia.io

    What’s happening is that support from VC money is drying up. Tech companies have for a long time survived on the promise that they will eventually be much more profitable in the future. It doesn’t matter if it’s not profitable today. They will be in the future.

    Now we’re in a period where there’s more pressure on tech companies to be profitable today. That’s why they’re going for such anti consumer behaviors. They want to make more with less.

    I’m not sure if there’s a bubble bursting. It could just be a plateau.

    • XIIIesq@lemmy.world
      link
      fedilink
      English
      arrow-up
      7
      arrow-down
      1
      ·
      2 months ago

      I agree. Smartphones, for example, have hardly changed at all over the last ten years, but you don’t see Apple and Samsung going out of business.

      • AA5B@lemmy.world
        link
        fedilink
        English
        arrow-up
        3
        ·
        2 months ago

        I understand you don’t appreciate where we’ve come from and how fast, and can’t see the year-to-year changes, but the iPhone is just a little over ten years old. Do you really not see huge changes between an early iPhone and today’s?

        • XIIIesq@lemmy.world
          link
          fedilink
          English
          arrow-up
          2
          ·
          edit-2
          2 months ago

          On the contrary, I absolutely appreciate it. I was about 15 when mobile phones first became something that everyone owned, so I’ve lived through the entire progression, from when they were something only a well-to-do businessman would have all the way through to today. The first iPhone was released in 2007, 17 years ago, btw.

          When mobile phones became popular, each new generation of phones saw HUGE improvements and innovation. However, the last ten years have pretty much just been slight improvements to screen/camera/memory/CPU. Form-wise and functionally, they’re very similar to the phones of ten years ago.

          I understand that some technophiles will always be able to justify why the new iPhone is worth £1,600, and if that’s what they want to spend their money on then good for them, but I personally think they’re kidding themselves. Today you can get a brilliant phone for £300 or even less.

          • AA5B@lemmy.world
            link
            fedilink
            English
            arrow-up
            2
            ·
            edit-2
            2 months ago

            I’d never justify that urge to spend ridiculous money updating every year to the latest and greatest, but people tend to under-appreciate the massive gains that accumulate from incremental improvements.

            The OLED screen on my iPhone X was revolutionary (and I’m sure Android had it first), as just one example, and now most phones have one. Personally I find ultra-wideband and Find My very innovative and well implemented. Or if that’s too small a change, how about the entire revolution of Apple designing their own SoC for every new model? There’s emergency satellite texting, fall/crash detection, and even Apple mostly solving phone theft is innovative (even if you don’t like their approach).

            When we see steady improvements, humans tend to under-appreciate how they add up.

            • XIIIesq@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              edit-2
              2 months ago

              I’m not going to argue that there has been no progress, just that it’s not on the same scale.

              Look at the difference between phones from 2004 to 2014, then from 2014 to 2024, and surely you’d have to agree. We’re looking at huge leaps in tech and innovation vs. much smaller incremental improvements.

              And I’d once again like to state that this is not a complaint, just a point of view showing that astonishing amounts of technological innovation are not necessarily required to keep companies in business.

          • Buttflapper@lemmy.worldOP
            link
            fedilink
            English
            arrow-up
            1
            ·
            2 months ago

            Damn, that’s wild. Any business with such drastic swings in profit and loss can’t possibly be sustainable. I can’t see how it could be. Look at the automobile giants in the USA: all it took was one major economic event to bankrupt them, and they got bailed out, which never should’ve happened. It’s bullshit.

            • barsoap@lemm.ee
              link
              fedilink
              English
              arrow-up
              1
              ·
              2 months ago

              Memory chips have had an utterly fickle market ever since there have been memory chips; companies in that business are still in that business because they learned how to deal with the swings. If Micron can survive (and they will), then so will Samsung, whose memory chip business has the whole conglomerate to fall back on.

            • XIIIesq@lemmy.world
              link
              fedilink
              English
              arrow-up
              1
              ·
              edit-2
              2 months ago

              Yeah, I’m not for bailing out companies that are “too big to fail”; I see it as socialism for the rich and capitalism for the poor, but that’s a separate debate.

              Tech stocks were an interesting case, as they bloated far beyond their actual value during COVID; what happened in 2023 was probably somewhat of a renormalization, and now they’re back to business as usual. There will always be peaks and valleys, but I’d be very surprised to see tech stocks fail in the long term.

      • barsoap@lemm.ee
        link
        fedilink
        English
        arrow-up
        1
        ·
        2 months ago

        And it would be so easy to make a big splash in the market by having a phone where the camera doesn’t protrude out of the back.

        • XIIIesq@lemmy.world
          link
          fedilink
          English
          arrow-up
          1
          ·
          edit-2
          2 months ago

          To be fair, some phones already have that, but they have much lower-spec cameras/lenses, so it’s currently a trade-off.

          If a flagship phone were to find a way to implement a flush, top-spec camera, it would still only be an incremental improvement rather than a great new technology or a substantial innovation.

  • solomon42069@lemmy.world
    link
    fedilink
    English
    arrow-up
    15
    ·
    edit-2
    2 months ago

    My biggest gripe with big tech is how governments of the world encourage their worst behaviours. Governments and businesses have failed to maintain their own level of expertise and understanding of technology.

    Today everything relies on tech, but all the solutions are outsourced and rely on “guidance” and free handouts from vendors like Microsoft. This has caused situations where billions are poured into digital transformation efforts with fuck all to show for it but administrative headaches, ballooning costs, and security breaches.

    I’m so tired of Silicon Valley frat boys being the leaders of our industry. We need to go back to an industry led by engineers and ideas, focused on solving problems and making lives better, not on building bullshit unsustainable monopolies on top of a huge pile of money. Right now big tech is the embodiment of all of capitalism’s worst qualities.

    P.S. Apologies if my comment is a bit simplistic and vague. I didn’t want to write a 10-page rant but still wanted to say my 2c about the state of things.

  • sukotai@lemmy.world
    link
    fedilink
    English
    arrow-up
    14
    ·
    2 months ago

    it’s time for you to play Pac-Man, as I did when I was young 😂
    no AI, no GPU, no shitcoin: you just have to eat ghosts, which is very strange in fact when you think about it 🤪

    • emax_gomax@lemmy.world
      link
      fedilink
      English
      arrow-up
      12
      ·
      2 months ago

      Correction: the ghosts are AI, and based on how many times they killed me, clearly a step above anything mainstream today (º ロ º๑).

  • Lvxferre@mander.xyz
    link
    fedilink
    English
    arrow-up
    12
    ·
    edit-2
    2 months ago

    It’s interesting how interconnected those points are.

    Generative A"I" drives GPU prices up. NVidia now cares more about it than about graphics. AMD feels no pressure to improve GPUs.

    Stagnant hardware means that game studios, who used to rely on “our game currently runs like shit but future hardware will handle it” and similar assumptions, get wrecked. And gen A"I" hits them directly due to FOMO plus corporations buying into trends without understanding how the underlying tech works, wasting talent by firing people in the hope that A"I" can replace them.

    Large game companies are also suffering due to their investment in the mobile market. A good example is Ishihara; sure, Nintendo simply ignored his views on phones replacing consoles, but how many game company CEOs thought the same and rolled with it?

    I’m predicting that everything will go down once it becomes common knowledge that LLMs and diffusion models are 20% actual usage, 80% bubble.

    • bad_news@lemmy.billiam.net
      link
      fedilink
      English
      arrow-up
      7
      arrow-down
      2
      ·
      2 months ago

      The backlash to this is going to be fun. Having lived through the .com boom/bust (which wasn’t a scam; the web actually was the future and, if anything, was undersold), no one with the stink of computer on them outside of a tiny elite could get decent full-time work for like 5 years. AI is a scam, full stop. It has virtually no non-fraud real-world applications that don’t reflect the underlying uselessness of the activity it can do. People are going to go full Butlerian Jihad from Dune when this blows up the economy, and it’s going to suck so much more for everyone in tech, scammer or no…

      • Lvxferre@mander.xyz
        link
        fedilink
        English
        arrow-up
        6
        ·
        2 months ago

        The backlash to this is going to be fun.

        In some cases it’s already happening, since the bubble forces AI-invested corporations to shove it in everywhere. Cue Microsoft Recall, and the outrage against it.

        It has virtually no non-fraud real world applications that don’t reflect the underlying uselessness of the activity it can do.

        It is not completely useless but it’s oversold as fuck. Like selling you a bicycle with the claim that you can go to the Moon with it, plus a “trust me = be gullible, eventually bikes will reach Mars!” A bike is still useful, even if they’re building a scam around it.

        Here are three practical examples:

        1. I use ChatGPT as a translation aid. Mostly to list potential translations for a specific word, or as a conjugation/declension table. Also as a second layer of spell-proofing. I can’t use it to translate full texts without it shitting its own virtual pants - it inserts extraneous info, repeats sentences, removes key details from the text, butchers the tone, etc.
        2. I was looking for papers concerning a very specific topic, and got a huge pile (~150) of them. Too much text to read on my own. So I used the titles to pre-select a few of them into a “must check” pile, then asked Gemini to provide three-paragraph summaries for the rest (a rough sketch of this workflow is below). A few of them were useful; without Gemini I’d probably have missed them.
        3. [Note: reported use.] I’ve seen programmers claiming that they do something similar to #1, with code instead. Basically asking Copilot how a function works, or to write extremely simple code (if you ask it to generate complex code it starts lying/assuming/making up non-existent libraries).

        None of those activities is inherently useless, but they have common ground - they don’t require you to trust the output of the bot at all. It’s either things that you wouldn’t use otherwise (#2) or things where you can reliably say “yup, that’s bullshit” (#1, #3).
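
        For what it’s worth, a minimal sketch of the triage workflow in point 2 could look like the following. It assumes the OpenAI Python client rather than Gemini, purely because that’s the API sketched here; the model name, prompt wording, and overall shape are illustrative placeholders, not details from the comment above.

        ```python
        # Rough sketch of the triage workflow in point 2: batch-summarize papers you
        # would otherwise skip, then read in full only the ones that look relevant.
        # Assumes the OpenAI Python client (pip install openai) and an API key in the
        # OPENAI_API_KEY environment variable; model and prompt are placeholders.
        from openai import OpenAI

        client = OpenAI()

        def summarize(abstract: str) -> str:
            response = client.chat.completions.create(
                model="gpt-4o-mini",  # placeholder model name
                messages=[
                    {"role": "system",
                     "content": "Summarize this paper abstract in three short paragraphs."},
                    {"role": "user", "content": abstract},
                ],
            )
            return response.choices[0].message.content

        # The output is only a triage aid: anything that looks relevant still gets
        # read in full, so a hallucinated summary costs a little time, not correctness.
        ```

        The same shape covers point 1 as well: the model proposes candidate translations or conjugations, and the human does the accepting or rejecting.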

        • bad_news@lemmy.billiam.net
          link
          fedilink
          English
          arrow-up
          5
          arrow-down
          1
          ·
          2 months ago

          I would argue all of these things were possible sans “AI” (although it would have been sold as AI in the ’90s) via existing heuristics, adequately developed, if people had known that was the desired application.

          • Lvxferre@mander.xyz
            link
            fedilink
            English
            arrow-up
            2
            ·
            2 months ago

            They probably could, indeed - but you’d need multiple different applications, each for one use case. In the meantime an LLM offers you a tool that won’t hit all the nails, or drive all the screws, but does both decently enough in the absence of both a hammer and a screwdriver.

      • jacksilver@lemmy.world
        link
        fedilink
        English
        arrow-up
        1
        ·
        2 months ago

        I mean, I can list a lot of things AI (and I’ll limit it to Transformers, the advancement that drives LLMs) has enabled:

        • Audio Transcription
        • Greatly improved language translation
        • Improved computer vision, in some situations
        • Targeted image generation (combined with diffusion models), which we already know is being used in media and ads.

        AI isn’t a scam, but it’s being oversold and its limitations are being purposefully hidden. That being said, it is changing how things are done, and that’s not going to stop. We’re still seeing CNNs, one of the major AI/ML breakthroughs from over a decade ago, make an impact.
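
        As a concrete illustration of the first item on that list, the open-source openai-whisper package gets you usable transcription in a few lines; the model size and filename below are placeholders, and this is a sketch rather than a recommendation.

        ```python
        # Minimal transcription sketch with the open-source openai-whisper package
        # (pip install openai-whisper; it also needs ffmpeg available on the PATH).
        import whisper

        model = whisper.load_model("base")          # "base" is a small, CPU-friendly model
        result = model.transcribe("recording.mp3")  # placeholder filename; any audio ffmpeg can read
        print(result["text"])                       # the full transcript as plain text
        ```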

  • tee9000@lemmy.world
    link
    fedilink
    English
    arrow-up
    16
    arrow-down
    4
    ·
    edit-2
    2 months ago

    I really, truly suggest diversifying to newsfeeds without comment sections, like Techmeme, for a bit.

    Increasing complexity is overwhelming, and there’s plenty of bad shit going on, but there’s a lot that’s overblown in your post.

    Sorry for the long edit: I personally felt an improvement in my mental health when I did this for six months or so. Because seriously, whatever disinformation is happening in American news is so exhausting. We need to think whatever we want and then engage with each other when our thoughts are more individualized. Don’t be afraid to ask questions that might seem like you’re questioning some holy established Lemmy/Reddit consensus. If you are being honest about your opinions and aren’t afraid to look dumb, then you are doing the internet a HUGE service. We need more dumb questions and vulnerability to combat the obsession with appearing to be an expert. So thank you for making a genuine post of concern.

  • Dead_or_Alive@lemmy.world
    link
    fedilink
    English
    arrow-up
    13
    arrow-down
    1
    ·
    2 months ago

    The pace of technological change and innovation was always going to slow down this decade. But Covid, the war in Ukraine, and the decoupling from Russia/China have slowed it further.

    You need three things in abundance to create tech. First, an advanced economy, which rules out most of the world. Second, lots of capital to burn while you make said advances. Finally, lots of twenty- and thirty-somethings who will invent and develop the tech.

    For the last 20 years we’ve had all of those conditions in the Western world. Boomers were at the height of their earning potential, and their kids were leaving home in droves, letting them pour money into investments. Low interest rates abounded because capital was looking for places to be put to work. China was the workshop of the world, building low- to mid-range stuff and allowing the West to focus its surplus of Millennial-age workers on value-added and tech work.

    Now in the USA boomers are retiring and there aren’t enough Gen Xers to make up the difference. Millennials are only now getting around to household formation; their oldest cohort (Xennials) is just now entering its mid-40s and starting to move up in their careers, but they probably still have kids to support. So it will be some time before capital becomes plentiful again. Gen Z is large, but not large enough to backfill the loss of Millennials.

    Oh, and I should highlight that this is a US demographic phenomenon. Europe and Japan do not have large Millennial or Gen Z populations to replace their aging boomers. We have no modern economic model to map out what will happen to them.

    China is going through a demographic collapse worse than what you see in Europe or Japan, only they aren’t rich enough to compensate. Add in the fact that they decided to antagonize their largest trading partners in the West, causing the decoupling we are now seeing.

    The loss of their labor means the West has to reshore or find alternative low-wage markets for production, and spend a lot of capital building out plants in those markets to do so.

    Add the geopolitical instability of the war in Ukraine on top and you have a recipe for slower tech growth.