If there were changes in 2020 to 2024 inclusive, then yes, I'd write it as 2020-2024. But if not inclusive, then I'd write 2021-2023.
litchralee
I'm not any type of lawyer, especially not a copyright lawyer, but I've been informed that the point of the copyright date is to mark when the work (book, website, photo, etc.) was produced and when it was last edited. Both aspects are important: the original date is when the copyright clock starts counting, and having it further in the past is useful for proving infringement that occurs later.
Likewise, each update to the work imbues a new copyright on just the updated parts, which starts its own clock, and is again useful to prosecute infringement.
As a result, updating the copyright date is not an exercise in writing today's year. Rather, it's adding years to a list, compressing as needed but never removing any years. For example, if a work was created in 2012 and updated in 2013, 2015, 2016, 2017, and 2022, the copyright date could look like:
© 2012, 2013, 2015-2017, 2022
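For anyone maintaining their own site footer, that compression rule can be sketched in a few lines of Python (the helper name is mine, and I'm assuming runs of only two years stay spelled out, as in the example above):

```python
def compress_years(years):
    """Render a list of copyright years, collapsing runs of three or
    more consecutive years into a range like "2015-2017"."""
    runs = []
    for y in sorted(set(years)):
        if runs and y == runs[-1][1] + 1:
            runs[-1][1] = y          # extend the current run
        else:
            runs.append([y, y])      # start a new run
    return ", ".join(
        f"{a}-{b}" if b - a >= 2 else ", ".join(str(y) for y in range(a, b + 1))
        for a, b in runs
    )

print("©", compress_years([2012, 2013, 2015, 2016, 2017, 2022]))
# © 2012, 2013, 2015-2017, 2022
```

The important property is that years are only ever added, never dropped, so each update's clock stays on record.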
To be clear, I'm not terribly concerned with whether large, institutional copyright holders are able to effectively litigate their IP holdings. Rather, this is advice for small producers of works, like freelancers or folks hosting their own blog. In the age of AI, copyright abuse against small players is now rampant, and a copyright date that is always the current year is ammunition for an AI company's lawyer to argue that they didn't plagiarize your work, because your work has a date that came after when they trained their models.
Not that the copyright date is wholly dispositive, but it makes clear from the get-go when a work came under copyright protection.
The original reporting by 404media is excellent in that it covers the background context, links to the actual PDF of the lawsuit, and reaches out to an outside expert to verify information presented in the lawsuit and learned from their research. It's a worthwhile read, although it's behind a paywall; archive.ph may be effective though.
For folks that just want to see the lawsuit and its probably-dodgy claims, the most recent First Amended Complaint is available through RECAP here, along with most of the other legal documents in the case. As for how RECAP can store copies of these documents, see this FAQ and consider donating to their cause.
Basically, AXS complains about nine things, generally around: copyright infringement, DMCA violations (i.e. hacking/reverse engineering), trademark counterfeiting and infringement, various unfair competition statutes, civil conspiracy, and breach of contract (re: terms of service).
I find the civil conspiracy claim to be a bit weird, since it would require proof that the various other ticket websites actually made contact with each other and agreed to do the other eight things that AXS is complaining about. Why would those other websites -- who are mutual competitors -- do that? Of course, this is just the complaint, so it's whatever AXS wants to claim under "information and belief", aka it's what they think happened, not necessarily with proof yet.
At least on my machine, that link doesn't work unless I explicitly change it to HTTP (no S).
I vaguely recall a (probably apocryphal) story of an early washing machine-sized hard drive that lurched its way across the floor during a customer demo, eventually falling over once the connecting cables pulled taut.
That said, those hard drives did indeed move themselves: http://catb.org/jargon/html/W/walking-drives.html
It's for this reason that I sometimes spell out "Bytes" or "bits", e.g. 88 Gbits/s or 1.44 MBytes.
It's also especially useful for endianness and bit ordering: MSByte vs MSbit
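Part of why I bother: a one-letter slip between b and B is a factor-of-eight error. A trivial sketch of the conversion (the function name is my own):

```python
BITS_PER_BYTE = 8

def gbits_to_gbytes(gbits_per_s):
    """Convert a data rate from Gbits/s to GBytes/s."""
    return gbits_per_s / BITS_PER_BYTE

# 88 Gbits/s is "only" 11 GBytes/s -- an 8x difference riding on one letter.
print(gbits_to_gbytes(88))  # 11.0
```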
> The knot is non-SI but perfectly metric and actually makes sense, as a nautical mile is one minute of arc along a meridian
I do admire the nautical mile for being based on something which has proven to be continually relevant (maritime navigation) as well as being brought forward to new, related fields (aeronautical navigation). And I am aware that it was redefined in SI units, so there's no incompatibility. I'm mostly poking fun at the kN abbreviation; I agree that no one is confusing kilonewtons with knots, not unless there's a hurricane putting a torque on a broadcasting tower...
> No standard abbreviation exists for nautical miles
We can invent one: kn-h. It's knot-hours, which is technically correct but horrific to look at. It's like the time I came across hp-h (horsepower-hour) to measure gasoline energy. :(
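For anyone playing along, the arithmetic is at least clean: a knot is one nautical mile per hour, and the nautical mile has been exactly 1852 m since its SI redefinition. A quick sketch (the function name is mine):

```python
KM_PER_NMI = 1.852  # exact, per the SI redefinition of the nautical mile

def knot_hours_to_km(kn_h):
    """A knot is one nautical mile per hour, so knot-hours
    are just nautical miles; convert those to kilometers."""
    return kn_h * KM_PER_NMI

# Sailing at 10 knots for 10 hours covers 100 kn-h, i.e. 100 nmi, i.e. 185.2 km.
print(knot_hours_to_km(100))
```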
> if you take all those colonial units
In defense of the American national pride, I have to point out that many of these came from the Brits. Though we're guilty of perpetuating them, even after the British have given up on them haha
> An inch is 25mm, and a foot an even 1/3rd of a metre, while a yard is exactly one metre.
I'm a dual-capable American that can use either SI or US Customary -- it's the occupational hazard of being an engineer lol -- but I went into a cold sweat thinking about all the awful things that would happen with a 25 mm inch, and even worse things with 3 ft to the meter. Like, that's not even a multiple of 2, 5, or 10! At least let it be 40 inches to the meter. /s
> There's also other SI-adjacent strangeness such as the hectare
I like to explain to other Americans that metric is easy, using the hectare as an example. What's a hectare? It's about 2.47 acres. Or, more relatably, it's the average size of a Walmart supercenter, at about 107,000 sq ft.
1 hectare == 1 Walmart
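The arithmetic behind those numbers, for the curious (a quick sketch; 1 ft is exactly 0.3048 m by definition, and an acre is 43,560 sq ft):

```python
M2_PER_HECTARE = 10_000   # a hectare is 100 m x 100 m
M_PER_FT = 0.3048         # exact, by international definition

sqft_per_hectare = M2_PER_HECTARE / M_PER_FT**2   # ~107,639 sq ft
acres_per_hectare = sqft_per_hectare / 43_560     # ~2.47 acres

print(f"1 hectare ~ {sqft_per_hectare:,.0f} sq ft ~ {acres_per_hectare:.2f} acres")
# 1 hectare ~ 107,639 sq ft ~ 2.47 acres
```

So "about 107,000 sq ft" rounds the Walmart conveniently close to a true hectare.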
I'm surprised there aren't more suggestions which use intentionally-similar abbreviations. The American customary system is rich with abbreviations which are deceptively similar, and I think the American computer memory units should match; confusion is the name of the game. Some examples from existing units:
- millimeter (mm) vs thou (mil)
- meter (m) vs mile (mi)
- kilo (k) vs grand (G)
- kilonewtons (kN) vs knots (kn)
- statute mile (m/sm) vs survey mile (mi) vs nautical mile (NM/nmi) vs nanometer (nm)
- foot (ft) vs fathom (ftm)
- chain (ch) vs Switzerland (ch)
- teaspoon (tsp) vs tablespoon (tbsp)
- ounce (oz) vs fluid ounce (fl oz) vs troy ounce (ozt) vs Australia (Ozzie)
- pint (pt) vs point (pt)
- grain (gr) vs gram (g)
- Kelvin (K) vs Rankine (R; aka "Kelvin for Americans")
- short ton (t) vs long ton (???) vs metric tonne (t) vs refrigeration ton (TR)
FYI, the Intel code used to be here (https://github.com/intel/thunderbolt-utils) but apparently was archived a week ago. So instead, the video creator posted the fork here: https://github.com/rxrbln/thunderbolt-utils
Do you recommend dns.sb?
I know this is c/programmerhumor, but I'll take a stab at the question. If I may broaden it to collectively include software engineers, programmers, and (from the mainframe era) operators -- though I'll still use "programmers" for brevity -- then we can find examples of all sorts of other roles being taken over by computers or subsumed into a different worker's job description. So it shouldn't really be surprising that the job of programmer would also be partially offloaded.
The classic example of computer-induced obsolescence is the job of typist, where a large organization would employ staff to operate typewriters to convert hand-written memos into typed documents. Helped by the availability of word processors -- no, not the software but a standalone appliance -- and then the personal computer, the expectation moved to where knowledge workers have to type their own documents.
If we look at some of the earliest analog computers, built to compute differential equations for things like weather and flow analysis, a small team of people was needed to operate the machine and interpret the results for the research staff. But nowadays, researchers are expected to crunch their own numbers, possibly aided by a statistician or data analyst, but still working in R or Python themselves, as opposed to handing the job to a dedicated team that sets up the analysis program.
In that sense, the job of setting up tasks to run on a computer -- that is, the old definition of "programming" the machine -- has moved to the users. But alleviating the burden on programmers isn't always going to be viewed as obsolescence. Otherwise, we'd say that tab-complete is making human-typing obsolete lol