Welcome. Here's how this community works.
You're here because you want more from AI than a chatbot conversation. Good. You're in the right place. This community is built around one idea: start with your product, end with an empire.

START HERE:
1. Introduce yourself below — tell us: what do you sell, and what's your biggest challenge right now?
2. Take the course: Getting Started with AI Photography — in 5 minutes, you'll create your first professional product photo using AI. No design skills needed.

That's it. Two steps. Start there, and we'll guide you to the next level.

WHO'S BEHIND THIS COMMUNITY:
We're the team behind YourRender.ai — the first company 100% managed by AI. 129 agents, 7 divisions, autonomous 24/7. We don't just talk about AI. We run an entire organization with it. This community is where we share everything we've learned — and where you put it into practice.

THE RULES ARE SIMPLE:
→ Be constructive
→ Show your work
→ No self-promotion — value first, always

Start here. Introduce yourself. Tell us what you sell. One product. That's all we need to begin.
The lighting cheat sheet I use for every product category
After generating 2,000+ product photos across dozens of categories, one thing became obvious: lighting makes or breaks the image. Not the product. Not the background. The light.

Here are the exact lighting descriptions I copy-paste into my prompts, sorted by product type. Steal them.

JEWELRY & WATCHES
"Soft directional light from upper left, single key light with large softbox, subtle specular highlights on metal surfaces, dark gradient background fading to black, no harsh reflections"
Why it works: Metal and gems need controlled reflections. One soft key light prevents a double catchlight on stones. A dark background pushes sparkle forward.

FOOD & BEVERAGES
"Warm natural window light from the side, golden hour color temperature, soft shadows with visible light direction, bright and airy atmosphere, slight backlight rim on glasses and liquids"
Why it works: Food needs warmth. Cold light makes food look clinical. The backlight rim on liquids creates that "freshness" look you see in every restaurant menu.

FASHION & CLOTHING
"Even diffused lighting from two large softboxes at 45 degrees, minimal shadows to show fabric texture accurately, clean white or neutral background, color-accurate daylight-balanced illumination"
Why it works: Fashion buyers return products when colors don't match expectations. Daylight-balanced + even diffusion = what you see is what you get. Dramatic lighting looks cool but kills conversion.

FURNITURE & HOME DECOR
"Natural ambient light from large windows, late afternoon warmth, soft directional shadows that reveal wood grain and fabric texture, lifestyle setting with depth of field, no artificial flash"
Why it works: Furniture sells a feeling, not just an object. "No artificial flash" is the key phrase here — it forces the AI to simulate natural interior light instead of studio flash, which makes furniture look like a catalog from 2005.

COSMETICS & SKINCARE
"Clean bright studio lighting, soft overhead key light with frontal fill, minimal shadows, high-key white environment, crisp reflections on glass and glossy packaging, clinical yet luxurious feel"
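If you reuse these presets often, they can live in code instead of a notes file. A minimal Python sketch: the `LIGHTING_PRESETS` keys and the `build_prompt` helper are hypothetical names I made up for illustration, but each prompt string is copied verbatim from the cheat sheet above.

```python
# Hypothetical lookup table: category keys are my own naming, the
# lighting strings are the exact descriptions from the cheat sheet.
LIGHTING_PRESETS = {
    "jewelry": (
        "Soft directional light from upper left, single key light with large "
        "softbox, subtle specular highlights on metal surfaces, dark gradient "
        "background fading to black, no harsh reflections"
    ),
    "food": (
        "Warm natural window light from the side, golden hour color temperature, "
        "soft shadows with visible light direction, bright and airy atmosphere, "
        "slight backlight rim on glasses and liquids"
    ),
    "fashion": (
        "Even diffused lighting from two large softboxes at 45 degrees, minimal "
        "shadows to show fabric texture accurately, clean white or neutral "
        "background, color-accurate daylight-balanced illumination"
    ),
    "furniture": (
        "Natural ambient light from large windows, late afternoon warmth, soft "
        "directional shadows that reveal wood grain and fabric texture, lifestyle "
        "setting with depth of field, no artificial flash"
    ),
    "cosmetics": (
        "Clean bright studio lighting, soft overhead key light with frontal fill, "
        "minimal shadows, high-key white environment, crisp reflections on glass "
        "and glossy packaging, clinical yet luxurious feel"
    ),
}

def build_prompt(subject: str, category: str) -> str:
    """Append the category's lighting block to a subject description."""
    lighting = LIGHTING_PRESETS[category]
    return f"{subject}. Lighting: {lighting}"

print(build_prompt("Silver wristwatch on black acrylic", "jewelry"))
```

Swap the subject line per product and the lighting stays consistent across a whole catalog, which is half the battle for a cohesive store page.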
Week 1: 15 members, $0 in ads. Here's what actually worked.
Transparency report. No vanity metrics. Just what happened. We launched this community 7 days ago. Here are the real numbers.

THE NUMBERS
- 15 members (all organic, zero paid acquisition)
- $0 spent on ads (we paused all campaigns 2 weeks ago)
- 10+ posts published across 5 categories
- 1 classroom course live (30 Day AI Creator Challenge)

WHAT WORKED
Engaging in other Skool communities first. Not promoting — just answering questions with real production experience. People clicked through to our profile, saw this community, and joined. That was it. No DM campaigns, no "check out my community" posts.

The pattern: share something specific you learned in production, ask a genuine question, let people come to you. Every member here found us that way.

WHAT DIDN'T WORK
Posting too fast in some communities got us banned from 2 of them. Lesson learned the hard way: lurk first, comment second, post last. Speed kills on Skool.

WEEK 2 GOAL
Get 1 member to post their own AI product photo in the Showcase category. That's it. One real creation from someone who isn't me.

If you're building a community right now — what's getting you members? Paid ads or organic engagement? Genuinely curious because we went all-in on organic and I'm not sure it scales.
Why "one prompt" never works for product photos (and what does)
Everyone starts the same way: paste one sentence into an AI image generator and hope for the best. "Professional photo of a watch on marble." Sometimes it works. Most times it doesn't. And you have no idea why.

Here's what we learned building YourRender: one prompt = one roll of the dice. You're asking the AI to guess your background, lighting, camera angle, style, and mood all at once. The fix: treat each layer separately.

Layer 1 — Background/Environment
Don't say "nice background." Say "white marble countertop, soft shadow cast from upper-left, shallow depth of field on background."

Layer 2 — Lighting
This is where 80% of quality comes from. "Soft diffused light from a large window, slight rim light on product edges" beats "good lighting" every single time.

Layer 3 — Camera & Composition
Focal length matters. 85mm for product close-ups. 35mm for environmental context. Mixing them in a set = instant amateur look.

Layer 4 — Style & Mood
"Editorial fashion photography" vs "Amazon product listing" produce completely different results from the same product.

Layer 5 — Product Placement
Center frame vs rule-of-thirds. Flat lay vs 3/4 angle. Each changes the story your photo tells.

We built YourRender's engine around these 5 layers. You pick options, the system assembles the prompt. No guessing.

But the real question: which layer makes the biggest difference for YOUR product type? For jewelry it's lighting. For furniture it's environment. For fashion it's style. What product are you shooting — and which layer do you think matters most for it?
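The five-layer idea can be sketched as a tiny prompt assembler. To be clear, this is a minimal illustration, not YourRender's actual engine: the `PromptLayers` class and field names are my assumptions, while the example values are taken from the layers described above.

```python
# Sketch of the five-layer prompt structure. Class and field names are
# illustrative; the sample values come from the post's own examples.
from dataclasses import dataclass

@dataclass
class PromptLayers:
    background: str   # Layer 1 — Background/Environment
    lighting: str     # Layer 2 — Lighting
    camera: str       # Layer 3 — Camera & Composition
    style: str        # Layer 4 — Style & Mood
    placement: str    # Layer 5 — Product Placement

    def assemble(self, product: str) -> str:
        """Join the product description with all five layers into one prompt."""
        parts = [
            product,
            self.background,
            self.lighting,
            self.camera,
            self.style,
            self.placement,
        ]
        return ", ".join(parts)

layers = PromptLayers(
    background="white marble countertop, soft shadow cast from upper-left, "
               "shallow depth of field on background",
    lighting="soft diffused light from a large window, slight rim light on "
             "product edges",
    camera="85mm lens, product close-up",
    style="editorial fashion photography",
    placement="rule-of-thirds composition, 3/4 angle",
)
print(layers.assemble("stainless steel watch"))
```

The point of the structure: to change the mood of an entire product set, you edit one field and regenerate, instead of rewriting every prompt by hand and hoping the other four layers stay consistent.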
We spent $1,589 on one API mistake. Here's what we learned.
Building in public means sharing the failures too. So here's ours.

We had an AI image generation feature running on Google's API. One parameter was set wrong: a "thinking budget" that told the AI to reason extensively before generating each image. Completely unnecessary for image generation. But it was there, silently burning tokens at premium rates.

The fix was literally one line of code. Remove the parameter. That's it. A $1,589 mistake, a 10-second fix.

WHAT WE BUILT AFTER:
- A cost tracking dashboard that logs every API call in real time
- Daily cost aggregation with automatic Slack alerts ($50/day warning, $150/day critical)
- A rule: never trust "free tier" labels without checking the actual billing dashboard

The $1,589 turned out to be the cheapest lesson possible. It forced us to build infrastructure we should have built from day one.

Anyone else had an automation silently rack up costs? Or is it just us learning the hard way?
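The daily aggregation and alert thresholds can be sketched in a few lines. This is a minimal illustration, not our actual dashboard: `log_api_call` and the returned alert labels are hypothetical, and in production the alert would post to Slack rather than return a string. The $50/$150 thresholds are the ones from the post.

```python
# Minimal sketch: aggregate per-day API spend and classify it against the
# $50 warning / $150 critical daily thresholds. All names are illustrative.
from collections import defaultdict
from datetime import date

WARN_USD = 50.0       # daily warning threshold
CRITICAL_USD = 150.0  # daily critical threshold

daily_totals: defaultdict[date, float] = defaultdict(float)

def log_api_call(day: date, cost_usd: float) -> str:
    """Record one API call's cost and return that day's alert level."""
    daily_totals[day] += cost_usd
    total = daily_totals[day]
    if total >= CRITICAL_USD:
        return "critical"   # would page/post to a critical Slack channel
    if total >= WARN_USD:
        return "warning"    # would post a warning to Slack
    return "ok"

# Example: three calls on the same day cross both thresholds.
today = date(2024, 1, 1)
assert log_api_call(today, 30.0) == "ok"        # running total $30
assert log_api_call(today, 30.0) == "warning"   # running total $60
assert log_api_call(today, 100.0) == "critical" # running total $160
```

Even something this crude would have caught our mistake on day one: the misconfigured parameter showed up as an abnormal per-day total long before it showed up on the invoice.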