  • business writing

    I don’t know if this is still a problem, but I remember reading that some decades back, a number of companies had problems with people writing absolutely unusable emails.

    The problem, as I recall it being presented, was that historically the norm had been that you’d have a secretary take dictation. That secretary was basically a professional writer, and would clean up all the memos and whatever that went out.

    But at some point, companies generally decided that people should just be emailing each other directly. Now you weren’t dictating to a secretary. You were typing an email yourself. The problem was that a lot of people who had relied on secretaries to clean things up for years had no practice at all, and were suddenly writing their own material…and it was horrendous.

    I’d guess that that was probably some twenty years ago now, at least, so maybe the problem has aged out.



  • Some of this may have changed, but relative to when I went to school in the US?

    Primary education

    • I’d remove cursive if it’s still being taught. I’ve read some articles saying that it still was in Canada; I don’t know about the US. It has very limited uses – it’s optimized for writing a lot of text faster than printing, but if I’m going to produce a lot of text, I’m going to be typing it, not writing it longhand.

      kagis

      https://www.livenowfox.com/news/us-states-require-cursive-handwriting-students

      California and New Hampshire became the most recent states to pass legislation making cursive handwriting instruction mandatory. At least 25 other states require a similar form of instruction in schools, and another five states have legislation pending, according to data tracked by the American Handwriting Analysis Foundation.

      Sounds like it’s still in the curriculum.

    • On that note, typing. We did have minimal typing instruction, but typing is important, and I was still hunting-and-pecking until sometime in secondary education, when I forced myself to switch over to touch typing.

    • I’d kill arts and crafts. Very few of the things that I actually did were likely to be practical to build on. I’m dubious about traditional-media graphics being a core part of any curriculum – later in life, someone is a lot more likely to be doing graphic arts professionally on a computer.

    • Start math earlier. I was in two different school systems, and one pushed math harder and earlier. The kids there did much better on mathematics topics.

    • I have no idea what the state of computer application education is today, and I assume that it’s changed. Back when I was in primary school, there were too few computers available to teach the stuff, and we had very brief coverage in secondary education. I would hope that at this point, kids in primary education get some kind of coverage of text editing – I don’t know about word processing, which was kind of tied to paper documents, which are certainly less common these days – spreadsheets (or some kind of functionally-equivalent system), graphic design software, web browser use, and email. I’d assume that many people will learn this at home, but you’d be kind of disadvantaged in a number of fields if you don’t pick it up.

    Secondary education

    • More statistics. I saw one half-class as an elective at my school. This has been maybe one of the major things that I regret not having picked up earlier and in more depth, and I’m pretty sure that a number of people never get basic statistics, based on the number of times I’ve seen arguments where people don’t believe polls because nobody’s ever introduced them to sampling. (There’s a small sampling simulation after this list.)

    • Less calculus. The concepts are important; doing manual integration of symbolic equations is not, and that’s what I spent a lot of time in calculus on. When I went through, calculus was kind of the standard “mainline” math class if one wanted to take more math. I think in total I took three or four calculus classes in secondary and tertiary education, which is just excessive for nearly all fields, and a lot of what I was doing was not a great use of time even in terms of learning calculus. I remember that being absolutely driven home when I stopped by the office of the husband of one of my calculus professors with a question about a project I was doing – he was also a mathematics professor – and watched him pull out Mathematica to do a simple integration. I asked him about it – I mean, the guy was married to a calculus professor and had a PhD in math – and he said “nobody has time to waste doing manual integration”. I can run the open-source Maxima package on my phone and desktop today, and it can do symbolic integration (see the short sketch after this list). There is no reason to have blown all the time I did manually doing calculus problems.

      Sorry, bit of a pet peeve.

    • Personal finance should be included.

    • I did not like the history curriculum in my secondary education at all. It was overwhelmingly rote memorization. The textbook was pretty decent – we only covered a fraction of it, but I read through the rest and liked it. It wasn’t until I got to tertiary education that I had what I’d call a good history class – there was little memorization, and one mostly read content, discussed it, and wrote papers on it. Granted, that takes longer to grade, but there has to be some way to improve on memorization. Today, I really enjoy a lot of history.

    • My home economics class was, as I recall, mostly cooking, sewing, and arts and crafts. The cooking was useful, the clothing repair was minimally useful, and the arts and crafts were a waste of time.

    • I don’t know how to fix it, but I think that literature was horrible. I read some of the books that were covered in literature classes prior to those classes and enjoyed them. Reading the same books later for school was a miserable experience.

    • I took a speech class that had a segment on propaganda techniques, to try to make people aware of them in their environment. I think that was a good idea. I would guess that this isn’t widely available.

    • I’d like to see at least some form of basic economics at the secondary level. When I went through, economics was something that one only saw during tertiary education, not secondary.

    • I personally felt a bit overwhelmed when I hit formal proofs in tertiary education, as I hadn’t had much coverage in secondary education – IIRC, that was basically a portion of eighth-grade geometry. A friend had gone to a high school that provided much better coverage. Not all fields of study are going to require it, but I wish that I’d had more coverage in secondary education.

    • My secondary education did not offer some of the physics material, like electromagnetism, that I know some schools cover, which I regretted not having available.
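    To make the sampling point concrete, here’s a tiny simulation – the population and numbers are made up for illustration – of why a 1,000-person poll can stand in for millions of voters:

    ```python
    import random

    # Pretend the true support in a huge population is 52%,
    # and we poll 1,000 people chosen at random.
    true_support = 0.52
    sample_size = 1_000

    poll = sum(random.random() < true_support for _ in range(sample_size))
    estimate = poll / sample_size

    # The standard error is sqrt(p * (1 - p) / n) ≈ 1.6%, so the estimate
    # is typically within ±3 points of the truth – no need to ask everyone.
    print(f"true: {true_support:.1%}, poll estimate: {estimate:.1%}")
    ```

    Run it a few times: the estimate bounces around 52% but rarely strays far outside roughly 49–55%, which is exactly what a poll’s “margin of error” describes.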
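    And on symbolic integration: the calculus bullet above mentions Maxima; here’s the same kind of thing in Python via sympy, purely as an illustration of how cheaply a machine does the drills we did by hand:

    ```python
    import sympy as sp

    x = sp.symbols("x")

    # Classic by-hand integration drills, done instantly by a
    # computer algebra system.
    print(sp.integrate(x**2 * sp.exp(x), x))       # (x**2 - 2*x + 2)*exp(x)
    print(sp.integrate(sp.sin(x) * sp.cos(x), x))  # sin(x)**2/2
    ```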

    In general, I feel like I learned more in tertiary education than I did in secondary education per hour spent. On the other hand, I think that some of that was because the tertiary education curriculum was more self-driven and harder to grade. If you want to do that, that is going to add cost. Looking back, I kind of wish that my secondary education was generally closer to tertiary – more self-driven projects and such.

    Tertiary education

    My guess is that this differs a lot from person to person. I think that it’s harder to make recommendations that would apply to many people. I also think that in general, my tertiary education made better use of time than my primary or secondary education did – less that I’d change.

    • To fill an apparently-unrelated prerequisite, I took a class that covered some law, though I didn’t formally study law, and found that I picked up a lot of stuff that helped me understand what was going on later in life. I think that a lot of people would benefit from a low-level law course or two. It is not something that I would have planned for myself, but if I could go back in time, I think I would have told young me to go for it.

      I’d also add that the criminal law textbook we used was one of my favorite textbooks – it was dense from an information standpoint, and easy to understand.

    Overall

    • I have found that the wiki-style hypertext format, plus having a browser with a search engine available, works very well for learning material. I much prefer it to doing a linear run through a textbook. I think that it’s far preferable to listening to lectures, which run in real time (so you can’t easily slow down if something’s confusing, and can’t zip through things that you already understand). I wish that tons of material had been available in that format when I was a kid, and think that more emphasis should be given to it in education, if that isn’t already the case today.

    • Generally speaking, I think that listening to lectures, especially in tertiary education, was a waste of time. I can get the same material more quickly reading on my own than listening to someone do an ad-hoc presentation. Just assign the reading and have some kind of forum for taking questions.








  • In August 1993, the project was canceled. A year of my work evaporated, my contract ended, and I was unemployed.

    I was frustrated by all the wasted effort, so I decided to uncancel my small part of the project. I had been paid to do a job, and I wanted to finish it. My electronic badge still opened Apple’s doors, so I just kept showing up.

    I asked my friend Greg Robbins to help me. His contract in another division at Apple had just ended, so he told his manager that he would start reporting to me. She didn’t ask who I was and let him keep his office and badge. In turn, I told people that I was reporting to him. Since that left no managers in the loop, we had no meetings and could be extremely productive.

    They created a pretty handy app that was bundled with the base OS, and which I remember having fun using. So it’s probably just as well that Apple didn’t hassle them. But in all seriousness, that’s not the most amazing building security ever.

    reads further

    Hah!

    We wanted to release a Windows version as part of Windows 98, but sadly, Microsoft has effective building security.




  • I had some vague interest in some of this a while back: the idea of a “zero-admin” network where you could just have random people plug in more infrastructure, install some software package on nodes, and routing and all would just work. No human involvement beyond plugging in physical transport.

    Some things to consider:

    • People will, given the opportunity, use network infrastructure as a DDoS vector. You need to be strong against that.

    • It’s a good bet that not everyone in the system can be trusted.

    • Not only that, but bad actors can collude.

    • Because transport of data has value, if this is free, you have to worry about someone who currently provides transport for existing data just routing their traffic over your free system and flooding it.

    • If the system requires encryption to mitigate some of the above issues, that’s going to add overhead. For example, one sort of mechanism might be a credit-based system where one entity can prove that it has routed some amount of data from A to B in exchange for someone else routing some amount of data from C to D – Mojo Nation, the project Bram Cohen did before BitTorrent, used such a system to “pay” for bandwidth. (There’s a rough sketch of that sort of credit ledger right after this list.)

    • If you want your network to extend to routing data onto the Internet, that’s going to consume Internet resources. Even if you can figure out a way to set up a neighborhood network, the people who, for example, run and maintain submarine cables are not going to want to do that gratis. And yeah, to some degree, you can just unload costs onto other users, the way that it’s common for heavy BitTorrent users to pay the same monthly rate as that little old lady who just checks her email, even though said heavy users are tying up a lot more time on the line. But if you are successful, at some point, this stops flying below the radar and ISPs start noticing that User X is incurring a greatly disproportionate degree of resource usage. I should note that there are probably valid use cases that don’t extend to routing data onto the Internet, but if you don’t permit for that, that’s a very substantial constraint.
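    To make that credit idea concrete, here’s a minimal, hypothetical sketch of the accounting side – the class and the numbers are my own invention for illustration, not Mojo Nation’s actual protocol:

    ```python
    from collections import defaultdict

    class CreditLedger:
        """Hypothetical per-peer bandwidth credit ledger.

        A peer earns credit with us by relaying bytes on our behalf,
        and spends it when we relay bytes for it. We stop serving a
        peer whose debt exceeds the credit floor.
        """

        def __init__(self, credit_floor=-10_000_000):
            self.balances = defaultdict(int)  # peer_id -> byte balance
            self.credit_floor = credit_floor  # most debt we extend

        def record_we_relayed(self, peer_id, num_bytes):
            # We carried num_bytes for this peer; its balance drops.
            self.balances[peer_id] -= num_bytes

        def record_peer_relayed(self, peer_id, num_bytes):
            # This peer carried num_bytes for us; its balance rises.
            self.balances[peer_id] += num_bytes

        def should_serve(self, peer_id):
            # Refuse free-riders that only consume transport.
            return self.balances[peer_id] > self.credit_floor
    ```

    The hard part a real system adds is the “prove” step – signed receipts or probabilistic audits so a peer can’t just claim credit – and that’s exactly where the cryptographic overhead mentioned above comes from.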

    If anyone has to do something that they don’t want to do (e.g. run line from saturated point A to saturated point B), then you’re potentially looking at having to pay someone to do something, and then you’re just back to the existing commercial Internet system…which for most people, isn’t that expensive and does a reasonable job of moving data from Point A to Point B.

    From a physical standpoint, while different parts of the network can probably use different types of infrastructure, if you want sparse, cheap-to-deploy infrastructure over an area, my guess is that in many cases line-of-sight laser networks are probably your best bet, especially in cities. You can move data from point A to point B quickly through other people’s airspace without paying for it, today. Laser links come with some drawbacks: weather and such will disrupt them to some degree, so you have to be willing to accept that.

    The main application that I could think of for regional-only transport, avoiding routing onto the Internet, was some kind of distributed backup system. A lot of people have unused storage capacity. You can use redundant distributed data storage, the way Hyphanet does. You can let one user prove that they are storing a certain amount of data – and thus build credibility – by requesting hashes of data that they claim to be storing (see the sketch below). It won’t deal with, say, a fire burning down the whole area, but for a lot of people, some kind of “I store your offsite data using my unused storage capacity in exchange for you doing the same for me” arrangement might provide enough benefit on both sides to keep people using it. That sort of system can also tolerate the higher latency that encryption and redundancy handling involve. I think that getting “Internet service for free” out of such a system is going to be a lot harder.
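    Here’s a hedged sketch of that hash-challenge idea, assuming the data owner keeps a local copy to verify against; the function names are mine, for illustration:

    ```python
    import hashlib
    import secrets

    def make_challenge():
        """Owner picks a fresh random nonce for each audit."""
        return secrets.token_bytes(32)

    def prove_storage(nonce, stored_data):
        """Remote peer proves possession by hashing nonce + data.

        Without actually holding the data, the peer can't produce
        this digest: the nonce is unpredictable and never reused.
        """
        return hashlib.sha256(nonce + stored_data).hexdigest()

    def verify(nonce, local_copy, claimed_digest):
        """Owner recomputes the digest from the local copy."""
        expected = hashlib.sha256(nonce + local_copy).hexdigest()
        return secrets.compare_digest(expected, claimed_digest)

    # One audit round:
    data = b"backup block contents..."      # owner's local copy
    nonce = make_challenge()
    response = prove_storage(nonce, data)   # computed by the remote peer
    assert verify(nonce, data, response)
    ```

    For large backups you’d challenge random blocks rather than hashing everything each round, but the principle is the same.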




  • And it’s not the battery itself because I’ve tried getting new batteries for it. It’s something in the charging circuitry. It works fine when it’s on wall power, but it just does not charge the battery.

    At least some Dell laptops authenticate to the charger so that only “authentic Dell chargers” can charge the battery, though they’ll run off third-party chargers without charging the battery.

    Unfortunately, it’s a common problem – and I’ve seen this myself – for the authentication pin on an “authentic Dell charger” to become slightly bent or something, at which point it will no longer authenticate and the laptop will refuse to charge the battery.

    I bet the charger on yours is a barrel charger with that pin down the middle.

    hits Amazon

    Yeah, looks like it.

    https://www.amazon.com/dp/B086VYSZVL?psc=1

    I don’t have a great picture for the 65W one, but the 45W charger here has an image looking down the charger barrel showing that internal pin.

    If you want to keep using that laptop and want to use the battery, I’d try swapping out the charger. If you don’t have an official Dell charger, make sure that the one you get is one of those (unless some “universal charger” has managed to break their authentication scheme in the intervening years; I haven’t been following things).

    EDIT: Even one of the top reviews on that Amazon page mentions it:

    I have a DELL, that has the straight barrel plug with the pin in it. THEY REALLY made a BAD DECISION when they made these DELL laptops with that type of plug instead of making it with a dog leg style plug. I have to replace my charger cord A LOT because the pin gets bent inside and it stops charging at that plug, but the rest of the charger is still good…


  • Up until the early 2000s, serial computation speed doubled about every 18 months. That meant that virtually all software just ran twice as quickly with every 18 months of CPU advances. And since taking advantage of that was trivial, new software releases did: they traded CPU cycles for shorter development time or more functionality, and demanded current hardware to run at a reasonable clip.

    In that environment, it was quite important to upgrade the CPU.

    But that hasn’t been happening for about twenty years now. Serial computation speed still increases, but not nearly as quickly any more.

    This is about ten years old now:

    https://preshing.com/20120208/a-look-back-at-single-threaded-cpu-performance/

    Throughout the 80’s and 90’s, CPUs were able to run virtually any kind of software twice as fast every 18-20 months. The rate of change was incredible. Your 486SX-16 was almost obsolete by the time you got it through the door. But eventually, at some point in the mid-2000’s, progress slowed down considerably for single-threaded software – which was most software.

    Perhaps the turning point came in May 2004, when Intel canceled its latest single-core development effort to focus on multicore designs. Later that year, Herb Sutter wrote his now-famous article, The Free Lunch Is Over. Not all software will run remarkably faster year-over-year anymore, he warned us. Concurrent software would continue its meteoric rise, but single-threaded software was about to get left in the dust.

    If you’re willing to trust this line, it seems that in the eight years since January 2004, mainstream performance has increased by a factor of about 4.6x, which works out to 21% per year. Compare that to the 28x increase between 1996 and 2004! Things have really slowed down.

    We can also look at the roughly twelve years since then, where growth has been even slower:

    https://www.cpubenchmark.net/compare/2026vs6296/Intel-i7-4960X-vs-Intel-Ultra-9-285K

    This is using PassMark’s single-threaded benchmark to compare the i7-4960X (Intel’s high-end processor back at the start of 2013) to the Intel Ultra 9 285K, the current one. In those ~12 years, the latest processor has managed single-threaded performance about (5068/2070) ≈ 2.45 times that of the 12-year-old processor. That’s (5068/2070)^(1/12) ≈ 1.077, about a 7.7% performance improvement per year. The age of a processor doesn’t matter nearly as much in that environment.
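    The annualized figure is just a compound-growth calculation; a quick sketch using the two scores quoted above:

    ```python
    # Compound annual growth rate of single-threaded performance,
    # from the two PassMark scores cited above.
    old_score, new_score = 2070, 5068  # i7-4960X vs. Ultra 9 285K
    years = 12

    total_speedup = new_score / old_score            # ~2.45x overall
    annual_rate = total_speedup ** (1 / years) - 1   # ~7.7% per year
    print(f"{total_speedup:.2f}x total, {annual_rate:.1%} per year")

    # For contrast, the old 18-month-doubling era over the same span:
    # 2 ** (12 / 1.5) = 256x, versus ~2.45x here.
    ```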

    We still have had significant parallel computation increases. GPUs in particular have gotten considerably more powerful. But unlike with serial compute, parallel compute isn’t a “free” performance improvement – software needs to be rewritten to take advantage of that, it’s often hard to parallelize solving problems, and some problems cannot be solved in parallel.

    Honestly, I’d say that the most-noticeable shift is away from rotational drives to SSDs – there are tasks for which SSDs can greatly outperform rotational drives.


  • One problem, I think, is that if you have a lot of assets invested in a particular game style, then it’s costly to revise the game.

    I remember that it happened with the original Halo, where the game was massively revised across different genres during development. But I think that in general, once you’ve made the assets, it’s increasingly painful to dramatically change the game.

    I’ve also heard complaints that AAA studios are “risk-averse” – but, honestly, I’d be kind of cautious about gambling a lot of asset money on an unproven game too.

    Whereas game genres that are extremely asset-light, like traditional roguelikes, often have pretty polished gameplay – the developers can cheaply iterate on the gameplay, because they don’t have to throw out much asset work.

    A lot of indie games today kind of fall into this camp, doing stuff like low-res pixel art to save on asset costs.

    One thing I’ve kind of wondered about is whether maybe more of the video game industry should look more like a two-phase affair. You have games made on relatively small asset budgets, kinda more like indie games. Some fail, some succeed.

    But then when one is really successful, it becomes common for a studio that specializes in AAA titles to acquire it and do a high-production-value version of the game. That de-risks the game somewhat, since the AAA studio knows that it has a game with popular gameplay, and specializes in churning out a really high-value form.

    Now, okay. That doesn’t work with all genres. Some genres, like adventure games, you only really play once. Some games don’t do very well on the low-asset side – it’s hard to create an open-world FPS game on a budget.

    But there have been a lot of times that I’ve purchased a low-asset-cost game that I really liked and then thought “I wish that there was more stuff on the asset side” – and wished that I could go and pay more to get it.

    Like, for those low-res pixel art games, I’d like the ability to get full-res art. I’d often like more soundtrack material. I’ve played a few games that have had outstanding voice acting, like Logan Cunningham in Transistor or Ron Perlman in Fallout: New Vegas, and I think that you could take many existing games, go back and add good voice acting, and make the experience a lot better. A lot of 3D games could benefit from more-extensive modeling and texturing.

    Yeah, some old games get remakes to take advantage of new technology, and sometimes they get fancier assets when that happens, but this isn’t that – I’m talking about taking a popular, relatively-current game with a limited asset budget and giving it a high-budget makeover.