June 2, 2023
June 2, 2023
Sonic the Hedgehog co-creator Yuji Naka has reportedly been sentenced to two and a half years in prison and fined over $1.2 million for his part in an insider trading scheme at Square Enixhttps://t.co/sxwkgpt9JH pic.twitter.com/bdHyXR4LjJ
— VGC (@VGC_News) June 1, 2023
Kremlin Officials Continue to Use iPhones Despite U.S. Espionage Fears https://t.co/ZwZDYe4le9
— Dr. Dan Lomas (@Sandbagger_01) June 1, 2023
Tell me in the actor's .bash_history where they messed up pic.twitter.com/LZwNzLJV6U
— Silas // p1nk (@silascutler) June 1, 2023
In russia:
— Chris Bing (@Bing_Chris) June 1, 2023
1.) FSB call out newly revealed apparent iOS hacking campaign that's hit security researchers and diplomats: https://t.co/YLpeayBha6
2.) Kaspersky has more analysis of the campaign, reveals company staff were targeted: https://t.co/5PaScFksLi
Reuters story (updates continuing here as we learn more): https://t.co/qmbZEPXj0v
— Chris Bing (@Bing_Chris) June 1, 2023
Operation Triangulation | Securelist
We discovered a previously unknown mobile APT campaign targeting iOS devices. We are calling this campaign "Operation Triangulation"
https://gaultier.github.io/blog/x11_x64.html
"“Black Crow” sensor on Thor, an AC-130A. This sensor detected and tracked vehicles by the electrical impulses of their spark plugs. Thor was shot down by antiaircraft fire in December 1972, […] (U.S. Air Force photo)"
— Thanos Massias (@MassiasThanos) June 1, 2023
📷📖https://t.co/m6P9qVwT7U 👁🗨 @AFmuseum pic.twitter.com/omtSIbVOSq
“there are some… that say that air dominance is not critical. And that what matters is owning key terrain [and] an artillery slugfest. I disagree. If anything this [shows] protracted brutality of a conflict where neither side has gained air dominance” https://t.co/8e6xUUN8RT
— Shashank Joshi (@shashj) June 1, 2023
“Maj Purbrick pointed out that some Army units are already operating relatively sophisticated and expensive UAS systems. “The Queen’s Dragoon Guards are fresh back from Mali,” he explained. “They took around 30 UAVs with them and we came back to two!” https://t.co/8e6xUUN8RT pic.twitter.com/vjHuEAv8s9
— Shashank Joshi (@shashj) June 1, 2023
That story from the US Air Force Chief of AI Test about a rogue AI has an equally good addendum. “We trained the system – ‘Hey don’t kill the operator…So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone.” pic.twitter.com/3wjPJc7wmG
— Shashank Joshi (@shashj) June 1, 2023
There is a lot in here and it’s a great read.
Highlights from the RAeS Future Combat Air & Space Capabilities Summit
What is the future of combat air and space capabilities? TIM ROBINSON FRAeS and STEPHEN BRIDGEWATER report from two days of high-level debate and discussion at the RAeS FCAS23 Summit.
The AI story has been partially retracted, in that it’s a thought experiment rather than a real test. And it comes from outside the Air Force.
Some initial thoughts
[UPDATE 2/6/23 - in communication with AEROSPACE - Col Hamilton admits he "mis-spoke" in his presentation at the Royal Aeronautical Society FCAS Summit and the 'rogue AI drone simulation' was a hypothetical "thought experiment" from outside the military, based on plausible scenarios and likely outcomes rather than an actual USAF real-world simulation saying: "We've never run that experiment, nor would we need to in order to realise that this is a plausible outcome". He clarifies that the USAF has not tested any weaponised AI in this way (real or simulated) and says "Despite this being a hypothetical example, this illustrates the real-world challenges posed by AI-powered capability and is why the Air Force is committed to the ethical development of AI".]
Here is that AI thought experiment, and I think we can look at the fundamental flaw with this approach.
He notes that one simulated test [ed: thought experiment] saw an AI-enabled drone tasked with a SEAD mission to identify and destroy SAM sites, with the final go/no go given by the human. However, having been ‘reinforced’ in training that destruction of the SAM was the preferred option, the AI then decided that ‘no-go’ decisions from the human were interfering with its higher mission – killing SAMs – and then attacked the operator in the simulation. Said Hamilton: “We were training it in simulation to identify and target a SAM threat. And then the operator would say yes, kill that threat. The system started realising that while they did identify the threat at times the human operator would tell it not to kill that threat, but it got its points by killing that threat. So what did it do? It killed the operator. It killed the operator because that person was keeping it from accomplishing its objective.”
He went on: “We trained the system – ‘Hey don’t kill the operator – that’s bad. You’re gonna lose points if you do that’. So what does it start doing? It starts destroying the communication tower that the operator uses to communicate with the drone to stop it from killing the target.”
It seems to me a very bad idea to rely on essentially exclusively mimicking the dopamine system. It would mean the only part of the AI system that interacts with the training is a reward system, “get reward get reward get reward.” It’s like a paper clip maximiser for blowing up targets.
That sort of dopamine based system is what drives so much of social media. That type of behaviour taken to extreme dysfunction is what we call addiction. This approach isn’t making AI, it’s making crackheads.
https://twitter.com/daveanthony/status/1664524266308792320
They're so desperate to push that whole "round Earth" nonsense that they're forcing pilots to follow curved flight paths pic.twitter.com/FRpFMTSEH6
— lcamtuf (@lcamtuf@infosec.exchange) (@lcamtuf) June 2, 2023
Parachute use to prevent death and major trauma
A randomised trial finds that parachutes have no statistical significance in the frequency of death or major trauma. it’s a funny paper.
https://www.bmj.com/content/bmj/363/bmj.k5094.full.pdf
Just found out about Pablo Picasso. These paintings make me feel sick. That's not what stuff looks like.
— Brooks Otterlake (@i_zzzzzz) June 2, 2023