Bing's AI image creator (powered by DALL-E)

Hi, request for advice.
So I recently got a VPN so I could use Cici. I saw comments here saying it was like Bing but less censored. I'm trying to use it now, but I'm getting pretty crappy results.
My question: Does Cici not do particulars? I am trying to get it to do certain characters from TV shows and such and it's not working.
How can you prompt Cici effectively to give you specific individuals?
 

Don't bother with characters or celebrities; it won't work with Cici. It's like a 1 in 100 chance or something.

You can create what you want, then face swap later.
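If you go that route, the usual free option is insightface's inswapper model (it's what roop and similar tools use under the hood). A rough sketch in Python, assuming you've downloaded inswapper_128.onnx yourself, since it isn't bundled with the pip package:

# Face swap sketch using insightface (pip install insightface onnxruntime opencv-python).
# Assumes inswapper_128.onnx was downloaded separately; it doesn't ship with the package.
import cv2
import insightface
from insightface.app import FaceAnalysis

app = FaceAnalysis(name="buffalo_l")          # detection + recognition bundle
app.prepare(ctx_id=-1, det_size=(640, 640))   # ctx_id=-1 forces CPU

swapper = insightface.model_zoo.get_model("inswapper_128.onnx")

source = cv2.imread("face_you_want.jpg")      # photo with the face to copy
target = cv2.imread("generated_image.jpg")    # the AI-generated image

src_face = app.get(source)[0]                 # take the first detected face
result = target.copy()
for face in app.get(target):                  # swap every face found in the target
    result = swapper.get(result, face, src_face, paste_back=True)

cv2.imwrite("swapped.jpg", result)

Generate whatever scene you want in Bing or Cici first, then run this over the output.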
 
Thanks. What would you recommend for face swapping, in terms of free tools?

And just to make sure: there's no better AI art tool than Cici right now, correct? Bing blocks me for adding words like "the" or "this", and that somehow gets the prompt flagged. lol.
Are there any tools out there right now besides Cici that don't flag everything?
 
Almost all of the images shared in this thread, excluding perhaps the last 4 or 5 pages of Cici images, were created using Bing, and I still use Bing to create mine (see above). So if you're getting dogged, there's something wrong with your prompts.
 

Does it block your prompt if it contains a celebrity's name or does it simply generate a picture containing someone else?
 
Oh wow. It almost always not only dogs me but throws the content warning for just using words like "hand" or "face" or "hair", any body part at all. It censors words like "beach", "water", "shirt", etc. Just adding one of those words to the prompt sets it off.
Maybe after a user racks up a certain number of content warnings it gets hyper-sensitive? Maybe I'm in that situation.
 
That's really odd, I've never heard of it being that restrictive. Have you tried simply using a different account?
Another option is going back to absolute basics and seeing what truly sets it off. Just start with "man standing in forest," or whatever, and build out the prompt one bit at a time: "man standing in forest wearing blue jeans and a white shirt," and so on (see the sketch below).
Sometimes it can be the order of words or a combination of words that sets it off. I've had prompts where everything worked fine as long as I let the subject wear red. Any other colour would get dogged, and removing "red" got a content warning.
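If you want to be systematic about that build-up, the bookkeeping is easy to script. A sketch of the idea; is_flagged is a stand-in you answer by hand, since there's no public Bing Image Creator API:

# Grow a prompt fragment by fragment to find which addition trips the filter.
FRAGMENTS = [
    "man standing in forest",
    "wearing blue jeans",
    "and a white shirt",
    "on a beach at sunset",
]

def is_flagged(prompt: str) -> bool:
    # Placeholder: paste the prompt into Bing yourself and record the result.
    answer = input(f"Did Bing flag this? [y/N]\n  {prompt!r}\n> ")
    return answer.strip().lower().startswith("y")

prompt = ""
for fragment in FRAGMENTS:
    candidate = f"{prompt}, {fragment}" if prompt else fragment
    if is_flagged(candidate):
        print(f"Flagged after adding: {fragment!r}")
        break
    prompt = candidate
else:
    print(f"Full prompt passed: {prompt!r}")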
Btw, do you use Stable Diffusion prompts in Bing?
 
Good idea about building the prompt, hadn't thought of that. You can use Stable Diffusion in Bing? I don't remember there being a separate section for that like on other AIs. Or is there something you type that achieves Stable Diffusion behaviour (e.g., negative prompts)? Any time I type "not x", it tends to have x in there.
 
Not really, but there are prompt generators for SD (you plug in a pic and it spits out a prompt to recreate that pic), and if you then plug the generated prompt into Bing it sometimes works, but it's extremely restrictive: you can't change a word or move a comma. Because you described something similar, I was just wondering.
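For reference, the prompt generator I have in mind is something like the clip-interrogator package; a minimal sketch (the model name is just the default the project suggests for SD 1.x):

# Reverse-engineer an SD-style prompt from an image (pip install clip-interrogator).
from PIL import Image
from clip_interrogator import Config, Interrogator

ci = Interrogator(Config(clip_model_name="ViT-L-14/openai"))
image = Image.open("reference.jpg").convert("RGB")
print(ci.interrogate(image))  # prints a prompt meant to recreate the image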
No, the official "beauty" of Bing is natural language usage instead of a language formula. But yeah, 'not X' tends to get you 'X'.
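Proper negative prompts are a Stable Diffusion feature, not a Bing one. If you ever run SD locally, it looks like this; a minimal sketch with the diffusers library, assuming a CUDA GPU and the SD 1.5 weights:

# Negative prompts in Stable Diffusion (pip install diffusers transformers torch).
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe(
    "man standing in forest wearing blue jeans and a white shirt",
    negative_prompt="hat, beard",  # steered away from, unlike Bing's 'not X'
).images[0]
image.save("out.png")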