
How to Use Midjourney's Omni-Reference for Character Consistency

May 8

3 min read

A Midjourney-generated image created using the Midjourney Automation Suite

Midjourney often rolls out exciting new features. The Omni-Reference function is one of the latest, and it brings a powerful way to guide your image generation. This feature stood out immediately, especially for how well it handles consistent characters across different images.

What is Omni-Reference in Midjourney?

At its core, Omni-Reference lets you use an existing image as a guide. You can tell Midjourney, "Look at this picture, and try to make my new image resemble it in some way."

Midjourney already had basic image prompting, but it wasn't always precise or reliable, especially for keeping a consistent look for a person or object across multiple generations. Omni-Reference aims to make this process much better and more reliable.

Getting Started: The --oref Parameter

Using Omni-Reference is straightforward. When you create an image, you include a parameter called `--oref`, short for omni-reference.

The `--oref` parameter takes the URL of the image you want Midjourney to reference. So, your prompt might look like:

/imagine prompt a man in a forest --oref [Image URL]

Controlling the Reference Strength with --ow

You can control how much Midjourney is influenced by the reference image using the `--ow` parameter, which stands for omni weight.

  • The `--ow` value ranges from 0 to 1000.

  • The default value is 100.

  • A higher `--ow` value (closer to 1000) means Midjourney will try harder to match the reference image's look.

  • A lower `--ow` value (closer to 0) means it will follow the reference more loosely.

Finding the right `--ow` value often requires some trial and error; what works best depends on your reference image and what you're trying to create.
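
For example, you might run the same prompt with a few different weights and compare the outputs side by side. The weight values below are only illustrative, and the reference URL stays a placeholder:

/imagine prompt a man in a forest --oref [Image URL] --ow 50

/imagine prompt a man in a forest --oref [Image URL] --ow 300

/imagine prompt a man in a forest --oref [Image URL] --ow 800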

Testing different prompt variations, including the `--ow` value, can take time. Tools built to manage this process can help. Consider automating your Midjourney prompts to quickly test various settings and find the perfect output.
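
As a rough illustration of the idea, a short script can write out the whole batch of prompts for you. This is only a minimal sketch that prints prompts for you to paste into Midjourney yourself; the base prompt, reference URL, and weight values are placeholders:

# Minimal sketch: print one prompt per --ow value so you can test them in Midjourney.
# The base prompt, reference URL, and weights are placeholders -- replace them with your own.

base_prompt = "a man in a forest"
reference_url = "https://example.com/reference.png"  # placeholder reference image URL
weights = [50, 100, 300, 600, 1000]  # --ow values to compare

for ow in weights:
    print(f"/imagine prompt {base_prompt} --oref {reference_url} --ow {ow}")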

Seeing Omni-Reference in Action: Character Consistency

This is where Omni-Reference shows real power. The video demonstrates this by using the presenter's own photo as a reference.

When generating images of a man using his photo as the `--oref` reference, Midjourney was able to capture notable characteristics.

  • In a Victorian-era image, the created character kept features like eye shape, nose, and some facial hair details.

  • Trying different angles, like a top-down view, still resulted in images that held onto the reference face, even if imperfectly.

  • Generating a side-profile photo using `--oref` worked exceptionally well, creating an image that strongly resembled the reference photo from that angle.

Testing also included a photo of a girl. After creating an initial image, that image was used as the `--oref` reference to generate pictures of the same girl in different clothing and settings (like walking on a beach). The results successfully kept the girl's face consistent while changing her outfit and environment, which shows potential for creating character-driven stories or series.
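
In practice, that second step might look something like this, where the reference URL points to the first image you generated and the weight value is just illustrative:

/imagine prompt a girl walking along a beach in a summer dress --oref [URL of first image] --ow 400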

Using Omni-Reference for Objects and More

While character consistency seems like a strong point, Omni-Reference isn't just for people. You can use it for objects too.

Initial tests with product photos had mixed results. Some attempts didn't keep the product reference well at all. Adjusting the `--ow` value helped get closer, but it suggests that referencing objects may need more fine-tuning than characters.

However, referencing a piece of furniture worked much better. Using a photo of a specific cabinet as the `--oref` reference, Midjourney created new images of furniture that kept the same texture and design style.

Referencing Multiple Elements

The feature also lets you reference multiple things from one image. The video showed an example using an image that had both a person and a specific sniper rifle.

By using this image with `--oref` and a high omni weight (`--ow 600`), the generated images attempted to keep both the person's look and elements of the sniper rifle's style and color. While not always perfect, it shows the possibility of integrating complex references into scenes.
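
A prompt along these lines would look something like the following; the scene description is only an illustrative example, while the `--ow 600` value matches the test above:

/imagine prompt a soldier taking position on a rooftop at dusk --oref [Image URL] --ow 600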

Creating images that require maintaining multiple consistent elements or characters can get complex. To streamline this and handle the variations needed for scenes or sequences, the Midjourney Automation Suite by TitanXT can simplify prompt management and iteration.

Initial Impressions and What's Next

Based on these early tests, Midjourney's Omni-Reference is a valuable addition, especially for achieving character consistency, which has been a common request. It still takes some experimentation to understand exactly how it performs in different situations and how best to use the `--ow` parameter.

Midjourney updates often include improvements and new features. Keep experimenting with Omni-Reference and see how it can enhance your creations.

To make the most of new Midjourney features and streamline your creative process, check out the Midjourney Automation Suite on TitanXT.

Omni-Reference is a step towards more controlled and consistent image generation. If you need to create a series of images featuring the same character or maintain the look of certain objects, give this feature a try.
