Bing's AI image creator (powered by DALL-E)

Hi, request for advice.
So I recently got a VPN so I could use Cici. I saw comments here saying it was like Bing but less censored. I'm trying to use it now, but I'm getting pretty crappy results.
My question: Does Cici not do particulars? I am trying to get it to do certain characters from TV shows and such and it's not working.
How can you prompt Cici effectively to give you specific individuals?

Does it block your prompt if it contains a celebrity's name or does it simply generate a picture containing someone else?
 
Almost all of the images shared in this thread, excluding perhaps the last 4 or 5 pages of Cici images, were created using Bing, and I still use Bing to create mine (see above). So if you're getting dogged, there's something wrong with your prompts.
Oh wow. Almost always it not only dogs me but also gives the content warning for just using words like "hand" or "face" or "hair", any body part at all. It censors words like "beach", "water", "shirt", etc. Just adding one of those words to the prompt sets it off.
Maybe after a user gets a certain number of content warnings, it gets hyper-sensitive? Maybe I'm in that situation.
 
That's really odd, I've never heard of it being that restrictive. Have you tried simply using a different account?
Another option is going back to absolute basics and seeing what truly sets it off. Just start with "man standing in forest," or whatever, and build out the prompt one bit at a time: "man standing in forest wearing blue jeans and a white shirt," etc.
Sometimes it can be the order of words or a combination of words that sets it off. I've had prompts where everything worked fine as long as I allowed the subject to wear red. Any other colour would get dogged, and removing "red" got a content warning.
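The build-it-up approach above can be sketched as a tiny script. Note this is just an illustration: is_blocked() is a stand-in for actually submitting the prompt and watching for a content warning, and the filter rule it checks is purely hypothetical.

```python
# Hedged sketch of "build the prompt one bit at a time" to isolate
# which addition trips a content filter. is_blocked() is a stub; the
# BLOCKED_COMBOS rule is hypothetical, just for demonstration.

BLOCKED_COMBOS = {frozenset({"man", "red"})}  # hypothetical filter rule

def is_blocked(prompt: str) -> bool:
    """Stub: in practice, submit the prompt and see if it gets dogged."""
    words = set(prompt.lower().replace(",", "").split())
    return any(combo <= words for combo in BLOCKED_COMBOS)

def first_blocked_fragment(fragments):
    """Extend the prompt one fragment at a time; return the fragment
    whose addition first triggers a block, or None if all pass."""
    prompt = ""
    for frag in fragments:
        candidate = (prompt + " " + frag).strip()
        if is_blocked(candidate):
            return frag
        prompt = candidate
    return None

print(first_blocked_fragment([
    "man standing in forest",
    "wearing blue jeans",
    "and a red shirt",
]))  # the last fragment trips the hypothetical "man" + "red" rule
```

Against a real filter you would replace the stub with an actual submission, but the loop is the same: the first fragment that flips a passing prompt into a blocked one is your culprit.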
Btw, do you use StableDiffusion prompts in Bing?
 
Good idea about building the prompt; I hadn't thought of that. You can use Stable Diffusion in Bing? I don't remember there being a separate section for that like on other AIs. Or is there something you type that achieves the Stable Diffusion effect (i.e., negative prompts)? Any time I type "not X", it tends to include X anyway.
 
Not really, but there are prompt generators for SD (you plug in a pic and it spits out a prompt to recreate that pic) and if you then plug the generated prompt into Bing they sometimes work but are extremely restrictive (can't change a word, can't move a comma). And because you described something similar I was just wondering.
No, the official "beauty" of Bing is the natural language usage instead of a language formula. But yeah, "not X" tends to get you X.
 
I wonder if Cici disabled the image tool because of the content people were making, or if they just got sick of us racking up their OpenAI bills. Each image generated by DALL-E costs a fraction of a penny, and that adds up quick. I know I've done thousands of them just by myself...

 
I got a message that it's now unable to generate adult content. RIP to another one
It just told me it can't generate images anymore. And the option on the panel has disappeared.
Literally right after I figured out how to get round its stupid geo block thing. Literally only got three images out the fucker. Devastated.
 
Working the monastery herb garden.


And again adapted for the celebs that still work for me:
Chris Evans:


Hugh Jackman:


Patrick Stewart:


Richard Armitage:


and Prince William:


DALL-E 3 still gives good bubble butt, it's just (semi-)clothed in Bing :)
 
Is there a thread on this site for how to write an effective prompt, where people share advice as Bing adapts? I think I need to go back to the basics, because I bet the problem is in how I phrase a typical prompt at the very beginning, the way I build the command from the first few words.