
Make Your Midjourney Characters Look the Same in Different Scenes

Apr 30

4 min read


A Midjourney generated image using Midjourney Automation Suite

Have you ever made a character in Midjourney that you really liked, then tried to put that same character in a new picture, maybe wearing different clothes or in a new place? Usually, it was hard to get the face, hair, and overall look to match exactly, and the results felt random.

Good news! Midjourney just added a new feature to help you keep your characters consistent. It's easy to use and works well.

Introducing the Character Reference Feature

Midjourney's new tool for character consistency is called `--cref`. This lets you use an existing image of a character as a guide for new images.

Suppose you have an existing character image you want to reuse.

Using the `--cref` feature, you can create new pictures that keep the original character's face, hair color, and style. It works even if the character is in a different pose or scene.

Midjourney recommends using images you made in Midjourney as your reference image. You can use other images, but results might be best with Midjourney-made ones.

How to Use --cref

Here are the steps to use this new feature:

  • Get the web address (URL) for your reference character image. If the image is in Discord, open it in your web browser and copy the address from the address bar.

  • Start typing your prompt for the new scene you want the character in. Describe the scene, pose, or action.

  • At the end of your prompt, add `--cref` followed by a space and the image URL you copied.

So, it looks like this: `/imagine prompt: [your scene description] --cref [image URL]`

When you run the prompt, Midjourney may replace the URL with a shortened link you can reuse later. You should see results where the character from your reference image appears in the new scene you described.
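If you prepare prompts in bulk, the format above is easy to generate programmatically. Here is a minimal Python sketch; the helper name `build_cref_prompt` is ours for illustration, not part of any Midjourney API:

```python
def build_cref_prompt(scene: str, cref_url: str) -> str:
    """Compose a Midjourney /imagine prompt with a character reference.

    `scene` is the new scene description; `cref_url` is the URL of the
    reference character image (for example, copied from Discord).
    """
    return f"/imagine prompt: {scene} --cref {cref_url}"


# Example usage:
# build_cref_prompt("a knight walking through a forest",
#                   "https://example.com/ref.png")
# -> "/imagine prompt: a knight walking through a forest --cref https://example.com/ref.png"
```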

Getting consistent characters can really speed up your workflow, especially if you're trying to tell a story or create many images with the same people. To take your Midjourney image generation to the next level, you might want to explore tools that automate parts of this process. Check out the TitanXT Midjourney Automation Suite to see how you can generate more images more efficiently.

Handling Unwanted Details

Sometimes, the character reference image might have other things in it, like a specific background or objects. Midjourney might try to include those extra details in your new scene.

For example, if your character reference image had balloons in the background, the new images might also show balloons, even if you didn't ask for them.

To avoid this, it's often a good idea to use a reference image where the character is alone in front of a plain background, like gray or white. You can create such an image by prompting Midjourney to put your character in front of a neutral background.

When you use this cleaner character reference image, the extra elements from the original scene should not appear in your new images.

Control How Much the Character Matches

You can control how closely Midjourney tries to match the reference character. This is done with the character weight parameter, `--cw`, which works similarly to the style weight parameter (`--sw`).

The `--cw` value goes from 0 to 100.

  • `--cw 0`: Midjourney focuses only on the character's face, ignoring hair and clothing from the reference.

  • `--cw 100`: This is the default setting. Midjourney tries hard to match the face, hair color, hairstyle, *and* clothing from the reference image.

You can try different values, like `--cw 50`, to find the right balance for your needs. A higher weight means a stronger attempt to match the full look of the reference character.
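If you are assembling prompts in code, it can help to clamp the weight to Midjourney's accepted 0-100 range before appending it. A small sketch; the helper name is hypothetical:

```python
def with_character_weight(prompt: str, cw: int) -> str:
    """Append a --cw parameter to a prompt, clamping the value to
    Midjourney's accepted range of 0 to 100."""
    cw = max(0, min(100, cw))
    return f"{prompt} --cw {cw}"


# Example usage:
# with_character_weight("/imagine prompt: hero --cref https://example.com/r.png", 50)
# -> "/imagine prompt: hero --cref https://example.com/r.png --cw 50"
```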

Experimenting with character weight helps you get just the right level of likeness. For users creating many different scenes, managing these parameters can become complex. Tools like the TitanXT Midjourney Automation Suite are designed to handle these different settings and help you generate large batches of consistent images effectively.

Matching Multiple Characters

What if you want two different specific characters in the same scene?

You can do this in a couple of steps:

  • First, generate your scene with the first character using their `--cref` link. Midjourney might make both characters in the scene look like your first character at this stage.

  • Use Midjourney's Vary Region tool. Select the area in the image where you want the second character to appear.

  • In the prompt box that pops up for Vary Region, add a description for the second character's position or action. Then, add the `--cref` link for the *second* character *and* the `--cref` link for the *first* character again. You include both `cref` links here.

  • Run the Vary Region prompt.

Midjourney should now replace the selected area with your second character, while keeping the first character consistent as well. This lets you create scenes with multiple specific characters you've already designed.
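The Vary Region prompt described above can also be assembled programmatically. This sketch assumes Midjourney's documented behavior of accepting multiple image URLs after a single `--cref` flag; the helper name is illustrative only:

```python
def vary_region_prompt(description: str, second_cref: str, first_cref: str) -> str:
    """Build the text for the Vary Region prompt box.

    `description` describes the second character's position or action.
    Both reference URLs are included after one --cref flag so that the
    second character fills the selected region while the first character
    stays consistent elsewhere in the image.
    """
    return f"{description} --cref {second_cref} {first_cref}"


# Example usage:
# vary_region_prompt("a woman standing on the right",
#                    "https://example.com/char2.png",
#                    "https://example.com/char1.png")
```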

Conclusion

The new `--cref` feature in Midjourney V6 is a big step forward for creating consistent characters across different images and scenes. Whether you need a character for a short story, a video project, or just a series of pictures, this tool makes it much easier to keep their look the same.

By understanding how to use `--cref`, including the `--cw` parameter and isolating your reference character, you can gain much more control over your Midjourney output. For anyone looking to streamline their Midjourney work, especially when dealing with character consistency across many images, automated solutions can provide a lot of help. Consider checking out the TitanXT Midjourney Automation Suite to see how it can assist in managing your projects and outputs.
