
How to Use JasperAI Depth Controlnet on Flux Dev

In this video, we walk you through how to create a ControlNet-powered workflow featuring Flux Dev and a custom LoRA!



Step by step

  1. Click Build to start a new Glif project.

  2. Add an image input block.

  3. Add a text input block.

  4. Add a ComfyUI block.

  5. Copy the JSON provided below into the ComfyUI block.

  6. (Optional) Adjust the LoRA loader node in the ComfyUI block.

  7. (Optional) Adjust the ControlNet strength settings in the ComfyUI block (a scripted way to make both of these tweaks is sketched after the JSON).

  8. Test and publish!

Insert this into the JSON field in the ComfyUI block.

{
  "3": {
    "inputs": {
      "seed": 376385039400309,
      "steps": 28,
      "cfg": 1,
      "sampler_name": "dpmpp_2m",
      "scheduler": "beta",
      "denoise": 1,
      "model": [
        "78",
        0
      ],
      "positive": [
        "47",
        0
      ],
      "negative": [
        "47",
        1
      ],
      "latent_image": [
        "83",
        0
      ]
    },
    "class_type": "KSampler",
    "_meta": {
      "title": "KSampler"
    }
  },
  "6": {
    "inputs": {
      "text": "{input1}, L1d4r style",
      "clip": [
        "78",
        1
      ]
    },
    "class_type": "CLIPTextEncode",
    "_meta": {
      "title": "CLIP Text Encode (Prompt)"
    }
  },
  "7": {
    "inputs": {
      "text": "",
      "clip": [
        "78",
        1
      ]
    },
    "class_type": "CLIPTextEncode",
    "_meta": {
      "title": "CLIP Text Encode (Prompt)"
    }
  },
  "8": {
    "inputs": {
      "samples": [
        "3",
        0
      ],
      "vae": [
        "27",
        0
      ]
    },
    "class_type": "VAEDecode",
    "_meta": {
      "title": "VAE Decode"
    }
  },
  "9": {
    "inputs": {
      "filename_prefix": "ComfyUI",
      "images": [
        "8",
        0
      ]
    },
    "class_type": "SaveImage",
    "_meta": {
      "title": "Save Image"
    }
  },
  "11": {
    "inputs": {
      "guidance": 4,
      "conditioning": [
        "6",
        0
      ]
    },
    "class_type": "FluxGuidance",
    "_meta": {
      "title": "FluxGuidance"
    }
  },
  "27": {
    "inputs": {
      "vae_name": "ae.sft"
    },
    "class_type": "VAELoader",
    "_meta": {
      "title": "Load VAE"
    }
  },
  "30": {
    "inputs": {
      "image": "{image-input1}"
    },
    "class_type": "LoadImage",
    "_meta": {
      "title": "Load Image"
    }
  },
  "31": {
    "inputs": {
      "preprocessor": "Zoe-DepthMapPreprocessor",
      "resolution": 1024,
      "image": [
        "75",
        0
      ]
    },
    "class_type": "AIO_Preprocessor",
    "_meta": {
      "title": "AIO Aux Preprocessor"
    }
  },
  "47": {
    "inputs": {
      "strength": 0.6,
      "start_percent": 0,
      "end_percent": 0.3,
      "positive": [
        "11",
        0
      ],
      "negative": [
        "7",
        0
      ],
      "control_net": [
        "48",
        0
      ],
      "image": [
        "31",
        0
      ],
      "vae": [
        "27",
        0
      ]
    },
    "class_type": "ControlNetApplyAdvanced",
    "_meta": {
      "title": "Apply ControlNet"
    }
  },
  "48": {
    "inputs": {
      "control_net_name": "flux/jasperai_depth_flux_dev.safetensors"
    },
    "class_type": "ControlNetLoader",
    "_meta": {
      "title": "Load ControlNet Model"
    }
  },
  "72": {
    "inputs": {
      "unet_name": "flux1-dev-fp8.safetensors",
      "weight_dtype": "default"
    },
    "class_type": "UNETLoader",
    "_meta": {
      "title": "Load Diffusion Model"
    }
  },
  "73": {
    "inputs": {
      "clip_name1": "t5xxl_fp8_e4m3fn.safetensors",
      "clip_name2": "clip_l.safetensors",
      "type": "flux"
    },
    "class_type": "DualCLIPLoader",
    "_meta": {
      "title": "DualCLIPLoader"
    }
  },
  "75": {
    "inputs": {
      "width": 1024,
      "height": 1024,
      "upscale_method": "nearest-exact",
      "keep_proportion": false,
      "divisible_by": 2,
      "width_input": [
        "79",
        0
      ],
      "height_input": [
        "79",
        1
      ],
      "crop": "disabled",
      "image": [
        "30",
        0
      ]
    },
    "class_type": "ImageResizeKJ",
    "_meta": {
      "title": "Resize Image"
    }
  },
  "78": {
    "inputs": {
      "repo_id": "glif/LiDAR-Vision",
      "subfolder": "",
      "filename": "Lidar.safetensors",
      "strength_model": 1,
      "strength_clip": 1,
      "model": [
        "72",
        0
      ],
      "clip": [
        "73",
        0
      ]
    },
    "class_type": "HFHubLoraLoader",
    "_meta": {
      "title": "Load HF Lora"
    }
  },
  "79": {
    "inputs": {
      "image": [
        "30",
        0
      ]
    },
    "class_type": "SDXLAspectRatio",
    "_meta": {
      "title": "Image to SDXL compatible WH"
    }
  },
  "83": {
    "inputs": {
      "width": [
        "79",
        0
      ],
      "height": [
        "79",
        1
      ],
      "batch_size": 1
    },
    "class_type": "EmptyLatentImage",
    "_meta": {
      "title": "Empty Latent Image"
    }
  }
}
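
Note that {input1} and {image-input1} are Glif variable placeholders: when the glif runs, they are replaced by the values from the text input and image input blocks. If you'd rather make the optional tweaks from steps 6 and 7 without hand-editing the JSON, here is a minimal Python sketch (the file name and LoRA repo are placeholders, not part of the tutorial):

import json

# Load the workflow saved from the ComfyUI block (hypothetical file name).
with open("depth_controlnet_workflow.json") as f:
    workflow = json.load(f)

# Step 6: point the HF LoRA loader (node "78") at a different LoRA.
workflow["78"]["inputs"]["repo_id"] = "your-username/your-flux-lora"
workflow["78"]["inputs"]["filename"] = "your_lora.safetensors"

# Step 7: soften the depth ControlNet (node "47") by lowering its strength
# and ending its influence earlier in the denoising schedule.
workflow["47"]["inputs"]["strength"] = 0.45
workflow["47"]["inputs"]["end_percent"] = 0.25

with open("depth_controlnet_workflow.json", "w") as f:
    json.dump(workflow, f, indent=2)

Lower strength and end_percent values follow the depth map more loosely, which gives the LiDAR-style LoRA more room to restyle the image.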

🎥 Check out the example Glif here!
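
Once published, the glif can also be run programmatically. A minimal sketch, assuming the Simple API endpoint described in "Running glifs via the API"; the glif ID, API token, and input values below are placeholders:

import requests

GLIF_ID = "your-glif-id"        # placeholder: ID of your published glif
API_TOKEN = "your-api-token"    # placeholder: API token from your Glif account

# Inputs are passed positionally, in the order the input blocks
# appear in the glif (here: the image input, then the text input).
response = requests.post(
    "https://simple-api.glif.app",
    headers={"Authorization": f"Bearer {API_TOKEN}"},
    json={
        "id": GLIF_ID,
        "inputs": ["https://example.com/source-image.jpg", "a cozy reading nook"],
    },
)
response.raise_for_status()
print(response.json().get("output"))  # URL of the generated image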