Bing's AI image creator (powered by DALL-E)

Oh wow. Almost every time, it not only dogs me but throws the content warning just for using words like "hand", "face", or "hair", any body part at all. It censors words like "beach", "water", "shirt", etc. Just adding one of those to the prompt sets it off.
Maybe after a user gets a certain number of content warnings, it gets hyper-sensitive? Maybe I'm in that situation.
That's really odd, I've never heard of it being that restrictive. Have you tried simply using a different account?
Another option is going back to absolute basics and seeing what truly sets it off. Just start with "man standing in forest," or whatever, and build out the prompt one bit at a time. "Man standing in forest wearing blue jeans and a white shirt," etc.
Sometimes it can be the order of words or a combination of words that sets it off. I've had prompts where everything worked fine as long as I allowed the subject to wear red. Any other colour would get dogged, and removing "red" got a content warning.
Btw, do you use StableDiffusion prompts in Bing?
 
Good idea about building the prompt; I hadn't thought of that. You can use Stable Diffusion in Bing? I don't remember there being a separate section for that like on other AIs. Or is there something you type that achieves Stable Diffusion behaviour (i.e., negative prompts)? Any time I type "not x", it tends to have x in there.
 
Not really, but there are prompt generators for SD (you plug in a pic and it spits out a prompt to recreate that pic), and if you then plug the generated prompt into Bing they sometimes work, but they're extremely brittle (can't change a word, can't move a comma). Since you described something similar, I was just wondering.
No, the official "beauty" of Bing is the natural language usage instead of a language formula. But yeah, 'not X' tends to get you 'X'
 
I wonder if Cici disabled the image tool because of the content people were making, or if they just got sick of us racking up their OpenAI bills. Each image generated by DALL-E costs a fraction of a penny, and that adds up quick. I know I've done thousands of them just by myself...

 
I got a message that it's now unable to generate adult content. RIP to another one
It just told me it can't generate images anymore. And the option on the panel has disappeared.
Literally right after I figured out how to get round its stupid geo block thing. Literally only got three images out the fucker. Devastated.
 
Working the monastery herb garden.
[attached images: MG01–MG04]


And again adapted for the celebs that still work for me: Chris Evans, Hugh Jackman, Patrick Stewart, Richard Armitage, and Prince William.
DallE-3 still gives good bubble butt, it's just (semi-)clothed in Bing :)
 
Is there a thread on this site for how to write an effective prompt, where people share advice as Bing adapts? I think I need to go back to basics, because I bet it's in how I'm phrasing a typical prompt at the very beginning, the way I'm building the command from the first few words.
 
I get calls to shut down or censor services when people start making fake porn of real people. There's a consent issue there. It sucks that the Taylor Swift problem nerfed Designer for all of us, but I understand the reasoning behind it.

I don't understand drawing attention to people making fake porn of people that don't exist. I actually like a lot of 404 Media's work, and I have a lot of personal concerns about contemporary AI tech, but this particular issue (perhaps selfishly) just feels like empty moral panic to me. Object to the copyright issues with image generators? Fine. Object to diverting people's wallets away from working studio and OF performers? Okay. Object to the environmental impact of running AI servers? Sure. But none of that was mentioned, just "people are breaking the TOS by making porn of imaginary people!" Silly.
 
Am I right in thinking there are currently no other instances of DALL-E we can use? Bing was good for a while, then Cici, while short-lived, was incredible… TensorArt is free but not great in comparison, and I'm not skilled enough to get the real benefit of it. I much prefer the style of the Bing/Cici art - is that specific to DALL-E?
 
I much prefer the style of the bing/cici art - is that specific to dall-e?
Yes. DALL-E costs money to use, so really only companies like Microsoft and ByteDance can afford to offer free access to the public. You can use it through a paid ChatGPT subscription, but I've heard that interface is similarly locked down, with the chatbot filtering everything that goes between you and DALL-E.

As far as I know, the only way to get unfiltered DALL-E at this point would be to make your own app that talks to it via API, and pay for each generation yourself. I imagine even going this route, you might risk losing your account for violating OpenAI's terms. I don't know how closely they monitor it (maybe not so closely, if Cici was mass producing porn for over a month).
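For anyone curious what the DIY route above actually looks like: a minimal sketch, using only the Python standard library, of hitting OpenAI's images/generations endpoint directly. The endpoint URL and payload fields match OpenAI's documented Images API; the helper function names are mine, and you'd need your own `OPENAI_API_KEY` with billing enabled (each generation is charged to your account).

```python
# Hedged sketch: calling DALL-E 3 directly via the OpenAI Images API.
# Helper names (build_payload, generate_image) are illustrative, not official.
import json
import os
import urllib.request

API_URL = "https://api.openai.com/v1/images/generations"

def build_payload(prompt: str, size: str = "1024x1024") -> dict:
    """Assemble the JSON body for a single DALL-E 3 generation."""
    # DALL-E 3 only accepts n=1 per request.
    return {"model": "dall-e-3", "prompt": prompt, "n": 1, "size": size}

def generate_image(prompt: str) -> str:
    """POST the prompt and return the URL of the generated image."""
    req = urllib.request.Request(
        API_URL,
        data=json.dumps(build_payload(prompt)).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Bearer {os.environ['OPENAI_API_KEY']}",
        },
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)["data"][0]["url"]
```

Note there's still a server-side safety layer on the API itself, so "unfiltered" really just means "no Bing/chatbot wrapper", and OpenAI can see every prompt tied to your account, which is where the ban risk comes from.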
 