I had this weird idea to use the Blend feature to create a blank canvas, then see if I could remix things into it. I created two JPG images, filling each with a slightly different tone of black so that the AI would treat them as different images. I used /blend and got back a grid of empty, noise-like images. I then turned on the Remix feature, made a Variation of one, and added an element to it: beaches in the summer. I ran some variations, then remixed a second time and added in a sky scene. After more rerolls, I remixed one last time to add birds.
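The post doesn't say how the two near-black JPGs were made; as a sketch, something like this Pillow snippet would do it, assuming 1024x1024 canvases and two arbitrary near-black tones (the sizes, filenames, and RGB values here are my own choices, not from the original experiment):

```python
from PIL import Image

# Two solid canvases in slightly different near-black tones, so the
# blend tool sees them as distinct images rather than duplicates.
for name, tone in [("black_a.jpg", (8, 8, 8)), ("black_b.jpg", (16, 16, 16))]:
    Image.new("RGB", (1024, 1024), tone).save(name)
```

Upload both files, then pass them to /blend as the two source images.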
Is this a great new way to use Midjourney? Hell no, lol, but it was a pretty interesting experiment. It was cool to watch the Remix feature slowly bring new elements in. Since the blended JPGs were so dark, the scenes came out dark; the same trick could be done with bright images for the opposite effect, or even with slightly textured images. For example, I blended two colorful gradient images and got a nice gradient back, and adding the same terms produced a much brighter, more colorful image. That said, it seems to work best piece by piece, adding elements while remixing, instead of adding everything all at once… So yea, stay tuned for more possibly useless and conceptual experiments.
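For the gradient variant, the source images could be generated the same way; here's a hedged sketch with Pillow, where the two color pairs, the 512px size, and the filenames are all my own assumptions (any colorful gradients should work):

```python
from PIL import Image, ImageDraw

def make_gradient(name, start, end, size=512):
    """Save a horizontal gradient from color `start` to color `end`."""
    img = Image.new("RGB", (size, size))
    draw = ImageDraw.Draw(img)
    for x in range(size):
        t = x / (size - 1)  # 0.0 at the left edge, 1.0 at the right
        color = tuple(round(s + (e - s) * t) for s, e in zip(start, end))
        draw.line([(x, 0), (x, size - 1)], fill=color)
    img.save(name)

make_gradient("grad_a.jpg", (255, 0, 128), (0, 128, 255))
make_gradient("grad_b.jpg", (255, 200, 0), (80, 0, 200))
```

Feeding the two results into /blend should give back a blended gradient, which then acts as a colorful base for remixing instead of a dark one.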
(click the prompt to copy it!)
/imagine prompt: https://s.mj.run/LfcqpOz3Rmo https://s.mj.run/5ZO8uaGxv-w beaches in the summer, blue sky clouds and sun, pelicans flying --v 4