I haven’t seen cooking specifically, but there have been quite a few papers on combining instruction-following LLMs with task-execution robot arms (like those used in manufacturing) to perform simple tasks from a plain-English instruction, e.g. “pick up the red ball and place it in the blue bowl”. Very cool research, but still very new.
Upholding international law by attacking unrelated commercial trade? Or by using child soldiers and human shields, or when they diverted international aid away from those in need and kept it for themselves, or when they blocked women in their controlled areas from travelling without a man, or when they tortured and raped female prisoners, or when they sentenced dozens of people to death for being gay, or when they raped, tortured and killed migrants from Africa. Were they doing what they could to uphold international law then too?
The issue is not that it can generate the images; it’s that Gemini’s filtering pre-prompt was coercing the images to include forced diversity in the generations. So asking for a 1940s German soldier would give you multiracial Nazis, even though that obviously doesn’t make sense and is explicitly not what was asked for.
Basically, override the default click event on an anchor tag and use JS to open the given link in a new tab.
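A minimal sketch of that idea: the `data-newtab` attribute, the `openInNewTab` name, and the injectable `open` parameter are all illustrative choices, not anything from the comment. The core handler is split out so the `window.open` call can be swapped for testing.

```javascript
// Core handler: suppress the anchor's default same-tab navigation and
// open the URL in a new tab instead. The `open` callback defaults to
// window.open but can be injected (e.g. for tests outside a browser).
function openInNewTab(event, url, open = (u) => window.open(u, "_blank", "noopener")) {
  event.preventDefault(); // cancel the default navigation
  open(url);              // open the link in a new tab
}

// Wiring (browser only): attach the handler to every anchor marked
// with a hypothetical data-newtab attribute.
if (typeof document !== "undefined") {
  document.querySelectorAll("a[data-newtab]").forEach((a) => {
    a.addEventListener("click", (e) => openInNewTab(e, a.href));
  });
}
```

Note that calling `window.open` inside a user-initiated click handler is generally allowed by popup blockers, whereas calling it outside one usually is not.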