Last updated on Mar 18, 2024
FAQ - Frequently Asked Questions
Beta Access
- How do I access the beta builds?
Existing Substance 3D customers can log in to Creative Cloud desktop with their Adobe ID and find them in the Betas section.
- Will the Steam version have access to the beta apps?
The beta builds are only available to Adobe Substance subscribers.
Generative AI in Substance 3D Apps
- What 3D Model data set was used to train the Substance 3D generative features?
While Text to Texture and Generative Background are features available inside Substance 3D applications, they are text-to-image workflows. Both Text to Texture and Generative Background are powered by the Firefly Image Model, which has been trained on licensed content, such as Adobe Stock, and public domain content where copyright has expired.
- How do generative credits work with Substance 3D apps?
Text to Texture in Sampler and Generative Background in Stager will consume 0 credits for a limited time.
Adobe, Firefly and generative AI
The complete Firefly FAQ is available here: https://www.adobe.com/products/firefly.html#faqs
- As an Adobe customer, will I have copies of my content included as part of the Firefly model?
No, copies of customer content are not included in the Firefly models.
- As an Adobe customer, will my content automatically be used to train Firefly?
No. We do not train on any Creative Cloud subscribers’ personal content. For Adobe Stock contributors, their content is part of the Firefly training dataset, in accordance with the Stock Contributor license agreements.
- Does Adobe plan to compensate Adobe Stock contributors whose content is used in the dataset to train Firefly models, and what will the compensation plan look like?
We have a compensation model for Adobe Stock contributors. For more information, see the Adobe Stock FAQ.
- What is Adobe doing to ensure AI-generated images are created responsibly?
As part of Adobe’s effort to design Firefly to be commercially safe, we’re training our initial commercial Firefly model on licensed content, such as Adobe Stock, and public domain content where copyright has expired. Additionally, as a founding collaborator of the Content Authenticity Initiative (CAI), Adobe is setting the industry standard for responsible generative AI. The CAI is a community of media and tech companies, NGOs, academics, and others working to promote adoption of an open industry standard for content authenticity and provenance.
- What is the Adobe Firefly Image 2 Model?
Adobe Firefly Image 2 is the next version of the image generation model within the Firefly family of models.