• 0 Posts
  • 16 Comments
Joined 3 years ago
Cake day: June 24th, 2023


  • This has messed with me for the longest time. 24h just wraps around at 24, simple modulo 24 arithmetic.

    12h? The hour and am/pm wrap around independently, and hence I am always confused whether 12 pm is supposed to be midnight or noon. Zero-based would have made more sense (with x pm being x hours after noon…)
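
    A minimal sketch of the usual conversion (my own illustration, not something from the thread); the only special case is exactly the one-based 12, which has to wrap back to 0:

```python
def to_24h(hour_12: int, pm: bool) -> int:
    """Convert a 1-12 clock reading plus am/pm into a 0-23 hour."""
    # The % 12 is the special case complained about above:
    # 12 am means hour 0, and 12 pm means hour 12.
    return hour_12 % 12 + (12 if pm else 0)

assert to_24h(12, pm=False) == 0   # 12 am is midnight
assert to_24h(12, pm=True) == 12   # 12 pm is noon
assert to_24h(1, pm=True) == 13

# A zero-based 12h clock (hours 0-11, with x pm meaning x hours after noon)
# would need no wrap at all: hour + (12 if pm else 0) would already be correct.
```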



  • I would love it if things weren’t as bad as they look, but…

    Most of the destruction of buildings in Gaza is of empty buildings with no inhabitants. The IDF blows up or bulldozes buildings when they find booby traps in them, have tunnel entrances, provide military advantage, were used for weapons storage or command, were used as sniper or RPG nests, block lines of sight, to clear security corridors, space for military camps and operations, and so on. The list of reasons is long and liberally applied by the bulldozer operators and sappers on the ground.

    (emphasis mine) While destroying military targets is fair, pretty much every building blocks lines of sight, including civilian housing, shops, hospitals, and so on. Applied liberally, this essentially amounts to destroying all buildings. Having your house (and nearby facilities like shops, schools, and hospitals) bulldozed will have a severe negative impact on your ability to live, even if you don’t die in the bulldozing or destruction itself.

    The IDF warns before major operations and then almost all civilians leave the area. The evacuation of Rafah is a good example for this. There are also targeted attacks, usually by air, in non evacuated areas, but these are only responsible for a small fraction of the destruction.

    (emphasis mine) While the IDF does do this, and it avoids immediate death for many, it still deprives people of the human right to housing. Furthermore, a warning does not provide those who evacuate or flee with housing, food and water - all of which are currently in significant shortage - and acting on the warning itself has a severe negative impact on one’s ability to provide for oneself, since one can only carry so much. A disregard for innocent human lives isn’t just civilian deaths; it is also the deprivation of resources that one needs to live.


  • It says ‘a neighborhood’, not ‘one neighborhood’. Furthermore, the article specifically mentions that it represents other neighborhoods in Gaza.

    A neighborhood provides an example of the disregard for innocent human lives behind the Israeli attacks, with visual proof provided by satellite imagery, even if it is one of many.

    Stating ‘one neighborhood’ would imply it is the only one. While the NY Times does not have the best track record, that criticism is needlessly reductive for an article that shows what is happening in Gaza. Especially as a picture of a single neighborhood can actually be more impactful than the whole: close enough that you can see the individual places where people live, far enough to see the extent of the destruction.


  • Also, ImageTragick was a thing; there are definitely security implications to adding dependencies to implement a feature this way (especially on a shared instance). The API at the very least needs to handle auth, so that your images and videos don’t get rotated by others.

    Then you have UX: you may want to show the user that things have rotated (otherwise the button will be deemed non-functional, even if it uses this one-liner behind the scenes), but you probably don’t want to transfer the entire video multiple times to show this (too slow, costs data).

    Yeah, it is one thing to add a one-liner, but another to make a well-implemented feature.
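
    To make that concrete, here is a minimal sketch of just the server-side plumbing, in Python with Flask purely as an illustration - the framework, route and helper names are my own assumptions, not anything from the actual project. The rotation itself stays a one-liner; the auth and ownership checks around it are the point:

```python
import subprocess
from flask import Flask, abort, request

app = Flask(__name__)

def current_user(req):
    # Placeholder: resolve the request's token/session to a user id, or None.
    return req.headers.get("X-User")

def owns_media(user, media_id):
    # Placeholder: check that this user actually owns the image or video.
    return True

def media_path(media_id):
    # Placeholder: map an id to a file path; never build paths from raw user input.
    return f"/var/media/{media_id}.jpg"

@app.post("/media/<media_id>/rotate")
def rotate(media_id):
    user = current_user(request)
    if user is None:
        abort(401)              # without auth, anyone could rotate your media
    if not owns_media(user, media_id):
        abort(403)
    # The actual "one-liner": shell out to an external tool such as ImageMagick.
    # This is exactly the kind of dependency ImageTragick-style bugs live in,
    # so on a shared instance the input should be validated and the tool sandboxed.
    subprocess.run(["mogrify", "-rotate", "90", media_path(media_id)], check=True)
    return {"status": "rotated"}
```

    And that still leaves the client-side half - previewing the rotation without re-downloading the whole video - as a separate piece of work, which is rather the point.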


  • It does have a dictionary entry though, e.g. “the branch of computer science that deal with writing computer programs that can solve problems creatively”, and I would argue that this definition fits.

    The definition “something that lets a computer perform tasks that are thought to require intelligence” depends on the person, and whether they think something required a form of intelligence. Accounting for all variables over a large distance so you hit your target seems like it requires a reasonable amount of intelligence to me.

    It is an extremely generic term though, almost like using ‘software package’. It is more often used as a buzzword than as something that provides significant clarification about how it works.



  • Saying AI = LLMs is a severe oversimplification though. LLMs and image generators are the subsets of AI that are currently most prominent and most commonly knowingly interacted with, but pretty much every formal definition is wider than that. Recommendation algorithms, as used on YouTube or social media, and smart (photo) search are further examples of AI that people interact with. Fraud detection, learning spam filters, abnormality (failure) detection and traffic estimation are even more examples. All of these things are formally defined as AI and are very much commonplace; I would not call them niche.

    The fact that LLMs and image generators are currently the most prominent examples does not necessarily exclude other examples from being part of the group too.

    Using AI as a catch-all phrase is simply a case of overgeneralization, in part due to the need for brevity. In some cases the difference does not matter, or the generalization is even beneficial. For example, ‘don’t train AI models on my art’ would only marginally affect applications other than image generation and image analysis, and covers any potential future applications that may pop up.

    However, statements like ‘ban AI’ could easily be misconstrued, and may be interpreted in a much wider manner than what the original author intended. People will have a variety of definitions of what does or does not constitute AI, which will lead to miscommunication unless it is clear from context.

    It probably wouldn’t hurt to clarify things specifically and talk about the impact of a particular application, rather than discussing what is (or is not) to be classified as AI.