About the LeiaDream category

LeiaDream is the first official app integrated with Stable Diffusion, built in partnership with StabilityAI. It is available exclusively on Lume Pad 2 and Nubia Pad 3D, and lets you turn any text prompt you dream up into beautiful AI images that you can view instantly in 3D.

You can use this section of the Leia Forum to:

  • Discuss downloading or using LeiaDream on Lume Pad 2 or Nubia Pad 3D

  • Share your LeiaDream creations with others

  • Ask questions about LeiaDream, or about Leia Credits as they relate to LeiaDream

  • Share your favorite prompt tips for LeiaDream

  • Give your feedback on the LeiaDream app


Where is it? It's not in your app store, and there's no link on your app page. I'm also not sure whether the processing is local or cloud-based (I assume the latter).

Currently I am using Midjourney plus ‘Leia Conversion’ to create the depth map, which I export as a Facebook image/depth map pair. I bring that into GIMP, convert it to 16-bit precision, and create a new bump-map layer from a desaturated copy of the image. I then duplicate that layer and the depth map to create positive and negative offsets, and merge them into a final, highly detailed depth map. The image and depth map go into Stereomkr to create an SBS, which is brought back into Leia Player for display and for uploading to Leia Pix.
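
Roughly, the offset-and-merge step looks like this in Python (a loose sketch, not my exact GIMP steps; the file names, offset size, and blend weights are just placeholders):

```python
# Loose sketch of the depth-map detailing step: blend a coarse depth map
# with luminance detail from the desaturated image, using small positive
# and negative horizontal shifts to thicken edges before merging.
import numpy as np
from PIL import Image

def load_gray(path):
    """Load an image as a float32 grayscale array in [0, 1]."""
    return np.asarray(Image.open(path).convert("L"), dtype=np.float32) / 255.0

luma  = load_gray("midjourney_image.png")  # desaturated source image (placeholder name)
depth = load_gray("leia_depthmap.png")     # coarse depth map (placeholder name)

# Positive and negative offsets of the depth map, then averaged.
# np.roll wraps around at the edges, which is fine for a quick test.
offset = 3  # pixels; tune to taste
blended = (np.roll(depth, offset, axis=1) + np.roll(depth, -offset, axis=1)) / 2.0

# Merge: coarse depth for overall structure plus a touch of luminance "bump"
# detail. The 0.8 / 0.2 weights are arbitrary starting points.
detailed = np.clip(0.8 * blended + 0.2 * luma, 0.0, 1.0)

Image.fromarray((detailed * 255).astype(np.uint8)).save("detailed_depthmap.png")
```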

Not sure how complicated your process is, but I would be happy to try it out on my LumePad.

Thanks,

Mike

@mebalzer It is in the Leia Appstore on Lume Pad 2 and Nubia Pad 3D.

It’s using StabilityAI’s API in partnership with them.

StabilityAI's models can run on local or cloud-based hardware. My question is: is this cloud-based? If it is, there should be no reason it can't run on the original Lume Pad. Also, if it is cloud-based, who hosts the servers, and what is the data retention policy?

It can in theory run on local computers, but you should check the VRAM requirements; there's not a mobile device in the world that can run it locally. And yes, as implied by our use of their API, it's using their cloud.

There are multiple reasons it can't, including that the 2D-to-3D conversion model we run locally on the generated images can't run on the original Lume Pad, and that the Android API level and APIs used to develop the app are too high for the original Lume Pad.

I'm assuming we can use prompt syntax and commands the same way you would in Stable Diffusion or StabilityAI's tools? Sorry, I have very little experience with it, so I don't know the terminology. But I'm impressed by the results even with my very limited experience.

It’s the same API as DreamStudio by StabilityAI, so yes.
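
For example, a weighted-prompt request against StabilityAI's public REST text-to-image endpoint (which DreamStudio is built on) looks roughly like the sketch below; the engine id, API key, and parameter values are just placeholders, and LeiaDream may expose only a subset of these options:

```python
# Minimal sketch of a text-to-image request against StabilityAI's public
# v1 REST API. Engine id, key, and parameter values are placeholders.
import base64
import requests

API_KEY = "sk-..."  # your StabilityAI key (placeholder)
ENGINE = "stable-diffusion-xl-1024-v1-0"  # example engine id

resp = requests.post(
    f"https://api.stability.ai/v1/generation/{ENGINE}/text-to-image",
    headers={
        "Authorization": f"Bearer {API_KEY}",
        "Content-Type": "application/json",
        "Accept": "application/json",
    },
    json={
        # Weighted prompts: positive weight = include, negative = avoid.
        "text_prompts": [
            {"text": "a misty forest at dawn, volumetric light", "weight": 1.0},
            {"text": "blurry, low detail", "weight": -1.0},
        ],
        "cfg_scale": 7,   # how strictly to follow the prompt
        "steps": 30,      # diffusion steps
        "width": 1024,
        "height": 1024,
        "samples": 1,
    },
    timeout=120,
)
resp.raise_for_status()

# The API returns base64-encoded images in the "artifacts" list.
for i, artifact in enumerate(resp.json()["artifacts"]):
    with open(f"dream_{i}.png", "wb") as f:
        f.write(base64.b64decode(artifact["base64"]))
```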
