Benjamin Guler of EvolveLab joins the podcast to talk about their tools for Revit, design workflows, automation, interoperability, and prompt-based AI rendering. We discuss what makes AI rendering novel compared to more traditional rendering tools, how that novelty affects rendering as a decision-making tool in the design workflow, how it has the potential to shift where visualization fits into the design phase, and other topics.
- Ben on LinkedIn
- Ben on Twitter
- EvolveLab website
- EvolveLab forums
- EvolveLab on LinkedIn
- EvolveLab on Twitter
- EvolveLab on Instagram
- Related episode: TRXL 006: ‘Doomed to Fail’, with Bill Allen
Connect with Evan:
Watch this episode on YouTube
131: ‘It’s Prompting Us’, with Benjamin Guler
Evan Troxel: [00:00:00] Welcome to the TRXL podcast. This is Evan Troxel. Just a quick reminder up front to sign up for my AEC tech email newsletter at TRXL.co. The latest issue has articles about a new podcast I'm working on called Confluence that you might be interested in, the Rhino version 8 beta release, and ChatGPT's new image generator that's coming soon to compete with the likes of Midjourney. Again, click the link in the show notes or head over to TRXL.co and click on one of the subscribe buttons.
In this episode, I welcome Benjamin Guler. Ben is a partner and the CTO at EvolveLab. Coming from an architectural background, Ben serves as a liaison between the AEC industry and the computer science world and works on app design, computational design, generative processes, process management, and standardization, both in their product offerings for the AEC industry and for the [00:01:00] clients they work with directly as consultants.
In this episode, Ben and I talk about EvolveLab's tools, including Helix, Glyph, Morphis, and Veras; design workflows; automation; interoperability; AI-based rendering driven by your model and a simple text-based prompt; the novel differences between AI-based rendering and more traditional rendering tools, which also involve texturing and lighting workflows; how this affects rendering as a decision-making tool in our design workflow; how this potentially shifts where rendering fits into the design phase compared to the rendering tools used up until now; and other topics. As a side note, Veras comes up pretty often in my TRXL AEC tech newsletter, which I mentioned a minute ago, because it seems to get updated every couple of weeks. So even what we talked about in this episode might already be outdated or built upon and enhanced. So again, sign up by visiting [00:02:00] trxl.co or click the link in the show notes.
As always, this was a great conversation. So without further ado, I bring you Ben Guler.
Evan Troxel: Ben, welcome to the podcast. Great to have you.
Ben Guler: Thank you. It's good to be here. Thanks for the invite; it's awesome to be here with you.
Evan Troxel: You're the second EvolveLab-er on the podcast. Bill Allen was on the show before, and I caught up with you guys a little bit at the AIA conference in San Francisco, and I was like, we have to make this happen, because there are so many things going on. I keep seeing EvolveLab. You guys keep releasing tools that look super useful, but also just intriguing.

I think you're taking the AI rendering thing head on, and we'll get to that in the conversation. Before we get there, I would love to hear the story about how you got involved at EvolveLab and what that's [00:03:00] been like for you over the past few years, because you were recently made partner there, so you've really settled in, obviously.

So give us the story, a little bit of the backstory on how that all happened.
Ben Guler: Sure. Yeah, thank you. So basically I started my career path in architecture. I went to UIC for that, and I've always gravitated towards the technical side. In the beginning, even in studio, I really cared about visualization, so that was a big deal for me. I tried to use the best hardware because I was a video game geek; I would always get the best graphics card and the best video games and upgrade my hardware to get the best quality. And because I had that, I knew the rendering software, how to model, all of that. So that was always a focus of mine when I was in school. At the beginning of my career I was hired as an intern to do renderings. This [00:04:00] connects to other things we're probably going to talk about, so I'm mentioning it.
Evan Troxel: Right.
Ben Guler: So I switched a few different jobs, and in the last one before EvolveLab I migrated to being a BIM manager, and that's where I started to use a lot of Dynamo, writing Python scripts and things like that. Then I found the limitations beyond scripting and started to write add-ins in C# and other languages. So I was gravitating a lot more towards the technical side, and I was doing that during nights and weekends. Then I was given a day a week to do it, and I was always looking forward to that: oh, okay, it's Friday, I can do some coding today, and it's actually part of my job. So it was really great. I really wanted to do that full-time, and I'd known about EvolveLab throughout the years. I'd even seen posts from other people who had gone there, like: oh, I'm not worthy, I can't join this firm yet because I don't know C# yet, or something like that. So over time, once I got my skills, as I thought, to a level of competence, [00:05:00] I said, okay, a job posting is there, I'm just going to apply for it and see what happens. I had a few different calls and joined EvolveLab, and I haven't regretted it since. So that's kind of the beginnings of it. Yeah.
Evan Troxel: How many Bill Allen AU courses or workshops did you go to? Because I think that's where a lot of people got introduced to Bill and his amazing classes, panel discussions, and round tables that he would host at AU, always super highly rated. Did you attend those as well?
Ben Guler: So I actually didn't go to AU much, but I watched the recordings.
Evan Troxel: Okay.
Ben Guler: So, yeah, the first time I went,

Evan Troxel: You were there in solidarity?

Ben Guler: So the first time, I'd watched the videos. Bill Allen, I knew about him; I'd watched a bunch of his videos, and they were pretty good back then, posted on YouTube. Even now, we still post regularly. So I knew him from those free tutorials: oh, I'm learning that, I want to know how to do that, or Revit categories or something, all [00:06:00] sorts of tricks that were in there. And then I think my first AU was in 2019, and I got to go and think, oh, this is great. This is awesome.
Evan Troxel: That's cool. Yeah, fun. I mean, Bill's just a great character in the technology sphere, and obviously he's really approachable, just an amazing guy. So I can see why you wanted to work there too. I mean, beyond the technology, right? It's just,

Ben Guler: Good culture.

Evan Troxel: a great culture. Yeah, for sure.
So, okay. The elephant in the room is Veras, right? Veras is the thing, and you guys have been releasing other tools as well; we can definitely get into that stuff too. But AI-based image generation from your model in various tools, right? Not just Revit, [00:07:00] but also Rhino and SketchUp. So before we get into how Veras actually works: where did this idea come from? How did you guys start to approach it before you even got to an actual product that got released?
Ben Guler: Yeah. Let me go through some of the products just for reference. We've had Helix, which is for interoperability; Glyph, which is auto-documentation; and we just recently released Morphis publicly. That was actually the order we were planning to release them in, but then Veras came in between, before Morphis. The idea is that we're looking at the industry, and we come from the industry, right? We're AEC,
Evan Troxel: Mm-hmm.
Ben Guler: and we've practiced in it and done the grassroots work, done the job, basically. So whenever there's an emerging technology, we take it and try to make it accessible, like: if I'd had that when I was a designer, if I'd had a tool for that, great. For example, this latest [00:08:00] one, Morphis, takes on what Revit Generative Design does, which is generative design and generative design algorithms. We've just built our own and packaged it in a very simple, accessible way, with an easier UI for end users. You don't have to know Dynamo, you don't have to connect things; you can still get all your design options, things like that.

And with Veras, it stays pretty true to the EvolveLab DNA, where we take emerging technologies and play with them a little bit, see if there's anything there, what we can do with them. With Veras, it started with looking at what was out there, and we saw the rise of Midjourney. That was really in 2022, last year; at the beginning it was kind of taking over social media. Almost every other post you saw was like that. Okay, there's something there.

So we started playing with it: okay, this is pretty cool, actually. And it wasn't that good back then, either; it just looked like [00:09:00] a Photoshop. And as DALL·E 2 and other tools came out, even the open source ones and other open source models, we started to play with those libraries and those technologies.

And we said, okay, these are very flattening technologies in the sense that they impact the whole world. Every industry, jewelry design, any industry that exists is going to be affected by this thing. So, us being in this space and making applications for AEC, how could we stay not just afloat but ahead, actually, and start to play with this technology, see how it could impact our current products, and build a proof of concept for something we don't yet have a product for? That's how it started. We actually first looked at some kind of Morphis integration, and we were looking at other machine learning integrations for our current applications. And in that effort we realized, well, we could actually use this just for rendering. And [00:10:00] myself, having that background in rendering, playing with it a lot and testing the different libraries, I thought: this is pretty cool. If we package this in a way that's very easy to access and let end users play with it and see if there's value in it, let's just go for it.

So we spent a few months putting it together: okay, let's see what this looks like. And then there's this pipeline we built, the infrastructure, because everything is cloud-based, cloud computing. How do you create all that infrastructure so we could use it for other machine learning models too, like large language models, besides image diffusion models? So it was almost serving to upgrade our tech stack, basically, and there was also a product at the end of it, because of other ideas we had. That's how it started. And then obviously we're keeping up with all the new releases and things that are coming out, integrating them into our application, iterating on top of that and adding new ideas [00:11:00] on top of it.
Evan Troxel: It seems like I keep seeing things coming out of school where people say, I just went to so-and-so's jury for their final presentations, and the students are using Midjourney, all the students are using AI rendering. To me, that's pretty interesting, right? Because at some level, what we all saw with prompt-based image generation was not a lot of control over the outcome, right? Or you have to spend just as much time crafting the prompt as you would creating the geometry and applying materials and lighting and entourage and all those things to create a rendering. The whole idea of prompt engineering,
Ben Guler: Right.
Evan Troxel: no matter whether that phrase triggers you or not,
Ben Guler: No, we use it all the time.
Evan Troxel: You do, I know. Yeah. And I'm talking to the audience right now,
Ben Guler: Ah,
Evan Troxel: because I know there are people who, every time they hear that, [00:12:00] roll their eyes: oh my God. But it does take some experience, right, to create a prompt that can give you anything close to what you want. And many people don't have the patience for that. They have the patience to spend hours and hours modeling the details, applying the textures, applying the lighting, hitting render, walking away, coming back, whether it's real time or not, whether it's V-Ray or Enscape or whatever it is. You've got time to do that, right? But maybe you don't have the time to put into crafting those prompts to get what you want.

So it's interesting to me to hear that when people go to these school juries, they're saying everyone's using this, all the students are using it, they're going crazy with this stuff. And
Ben Guler: Yeah.
Evan Troxel: The examples that I've seen come from the people who do this every single day, right? There's Hassan Ragab, I think that's how you say his name, who's always posting on LinkedIn. I follow him on Instagram, an early adopter of Midjourney [00:13:00] who has gone through all the different versions of it, really pointing out the nuance of what's changed between releases, what it's good at, what it's not good at, where it's gotten better, where it's regressed in different areas.

And I just think that level of commitment to that kind of new paradigm, that new tool set for working, is exactly what it is: a commitment to actually get there. And now you take the tool you guys have created, which kind of democratizes access to it. We can get more into the nuance; I kind of assume it's a diffusion-based and ControlNet kind of soup that you've put together on top of a program people are already building their model in, which gives the image its bones, which the prompt then builds on top of. And to me, that makes it so you don't have to [00:14:00] commit so much to becoming a quote-unquote prompt engineer, but more of a video gamer. To go back to where you started, it's like: this is a skin for my model, right? This is a skin for my game. I have a kid who loves to play Minecraft, and obviously we've seen all the stuff out there with Fortnite over the past few years; people love re-skinning their real-time engine with a different look, and that's what architecture students are now doing with their models. So maybe you can just speak to all of that that I just spit out on the table. Where does that get you thinking? What's your response to all that?
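Evan's guess here, that a tool like Veras conditions a diffusion model on the existing model view (a ControlNet-style setup) so the geometry supplies "the bones", can be illustrated at a toy level. The following is purely a hypothetical sketch of the conditioning step, not EvolveLab's actual implementation: a depth buffer rendered from the CAD model is reduced to an edge map, which is the kind of control image a ControlNet feeds alongside the prompt so generated pixels respect the model's geometry while the prompt controls the look.

```python
# Toy illustration of ControlNet-style conditioning (an assumption about how
# tools like Veras might work, not their actual implementation): the host
# application's rendered depth buffer is turned into an edge map, which then
# guides the diffusion model so the output respects the model's geometry.

def depth_to_edges(depth: list[list[float]], threshold: float = 0.1) -> list[list[int]]:
    """Mark pixels where depth changes sharply relative to a right/down neighbor."""
    h, w = len(depth), len(depth[0])
    edges = [[0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            for dy, dx in ((0, 1), (1, 0)):
                ny, nx = y + dy, x + dx
                if ny < h and nx < w and abs(depth[y][x] - depth[ny][nx]) > threshold:
                    edges[y][x] = 1
    return edges

# A 4x4 depth buffer: near wall (depth 1.0) on the left, far background (5.0) on the right.
depth = [
    [1.0, 1.0, 5.0, 5.0],
    [1.0, 1.0, 5.0, 5.0],
    [1.0, 1.0, 5.0, 5.0],
    [1.0, 1.0, 5.0, 5.0],
]
edges = depth_to_edges(depth)
# Column 1, where the wall meets the background, is marked as an edge.
```

In a real pipeline this control image would be passed to a ControlNet-augmented diffusion model together with the text prompt; the point of the sketch is only that the geometry, not the prompt, fixes where the edges of the design are.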
Ben Guler: No, I think that's exactly where our vision is for all of that. Like you mentioned, the prompt engineering aspect is pretty technical. Midjourney prompts don't work the same way for DALL·E or Stable Diffusion; they have different syntaxes that [00:15:00] behave in different ways. And as different models come out, you actually get different syntaxes too, because of the way they're trained, because of the different CLIP models used to understand the image and auto-generate the tags, the meta tags. So part of the challenge is exactly that. And this is something we've learned a lot with Helix: on average, even getting someone to write a prompt at all is a challenge. So if you can take that away, and what we're trying to do is not even have a prompt if possible, and still get the result. Yeah.
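Ben's point that the same intent has to be expressed differently per model can be made concrete with a small sketch. This is a hypothetical illustration, not EvolveLab's code, and the two output formats are simplified approximations of community conventions (Midjourney-style `--` flags versus the parenthesized `(term:weight)` emphasis syntax popularized by common Stable Diffusion UIs), not official specifications:

```python
# Hypothetical sketch: one structured prompt spec, serialized differently
# per image model, illustrating why prompt syntax is not portable.
# The formats below are simplified approximations, not official specs.

def build_prompt(spec: dict, target: str) -> str:
    """Serialize a structured prompt spec for a given image model's syntax."""
    subject = spec["subject"]
    styles = spec.get("styles", [])
    emphasis = spec.get("emphasis", {})  # term -> weight

    if target == "midjourney":
        # Midjourney-style: comma-separated terms plus '--' flags at the end.
        prompt = ", ".join([subject] + styles)
        if "aspect_ratio" in spec:
            prompt += f" --ar {spec['aspect_ratio']}"
        return prompt

    if target == "stable-diffusion":
        # SD-UI-style: "(term:weight)" tokens emphasize parts of the prompt.
        parts = [subject]
        for term in styles:
            w = emphasis.get(term)
            parts.append(f"({term}:{w})" if w else term)
        return ", ".join(parts)

    raise ValueError(f"unknown target: {target}")

spec = {
    "subject": "timber-frame house on a hillside",
    "styles": ["photorealistic", "golden hour"],
    "emphasis": {"photorealistic": 1.3},
    "aspect_ratio": "16:9",
}
print(build_prompt(spec, "midjourney"))
# timber-frame house on a hillside, photorealistic, golden hour --ar 16:9
print(build_prompt(spec, "stable-diffusion"))
# timber-frame house on a hillside, (photorealistic:1.3), golden hour
```

The design point is the one Ben makes: a tool that sits above several backends has to own this translation layer so the user never learns any of the syntaxes.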
Evan Troxel: I think this is a game in itself. If you made it Mad Libs, where there's already something there and you fill in this one blank, and then there's a little more of the sentence and you fill in the next blank, and then you read it [00:16:00] out loud, which is hitting go, everybody laughs because they love what comes back out of it. That, to me, is kind of what you're up against, right? You've got to create that. I keep hearing that the prompt is the perfect UI because it's conversational; people already know how to ask a question. But this is different. This isn't the large language model-based
Ben Guler: Right,
Evan Troxel: ChatGPT thing where I can ask it a question and get a response. I'm not asking my model a question, like: what do you want to be, model? I'm actually giving it cues. And all of a sudden people are faced with this blank page of the prompt, and there's just as much of that blank-page problem as there is when they start a design.
Ben Guler: Exactly.
Evan Troxel: Some people are like, give it to me, I'm ready. And other people are like, I want to think about this for three weeks.
Ben Guler: And that's the nuance, the balance, we're trying to strike. We do want the ability for those people to do that, to a certain degree, [00:17:00] and then also for the other people who are just like, ah, I'm just exploring, let's see if I can get something. And then once they get to that aha moment, oh wow, this actually can produce really cool results, how do I do that?

That's one of the things we're working on now: simplifying it even further, in a super simple way, using cues from the model. If you're in SketchUp, using cues from the SketchUp model, extracting more metadata from there. From Revit we already extract quite a bit of metadata that you don't have to know about; it's just there. You're in Revit, okay, let's see what we can get from there. Or Rhino. So we're minimizing that, because we're in that space and we can understand what I would need from the model if I were doing this myself. It's that challenge of making a package that is simple and very easy to use, but still allows that dynamic exploration, going deeper if you want to be more manual and start crafting and sculpting what you're trying to do. [00:18:00]
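Ben's description of pulling cues from the host model so the user never faces a blank prompt suggests a workflow like the following. This is a hypothetical sketch, not EvolveLab's implementation; the metadata keys and category names are invented for illustration, and a real add-in would query the Revit, SketchUp, or Rhino API instead of taking a plain dictionary:

```python
# Hypothetical sketch of the "no blank prompt" idea Ben describes: seed a
# rendering prompt from metadata already present in the model, so the user
# starts from an editable suggestion instead of an empty text box.
# The metadata keys here are invented for illustration.

def seed_prompt(model_meta: dict) -> str:
    """Build a starting prompt from extracted model metadata."""
    parts = []
    if building_type := model_meta.get("building_type"):
        parts.append(building_type)
    # Mention the dominant materials so the image model tends to keep them.
    for material in model_meta.get("materials", [])[:3]:
        parts.append(f"{material} facade")
    if site := model_meta.get("site_context"):
        parts.append(site)
    # A generic quality tail the user can edit or delete.
    parts.append("photorealistic architectural rendering")
    return ", ".join(parts)

meta = {
    "building_type": "two-story school",
    "materials": ["brick", "glass"],
    "site_context": "urban street, overcast sky",
}
print(seed_prompt(meta))
# two-story school, brick facade, glass facade, urban street, overcast sky, photorealistic architectural rendering
```

Even with an empty metadata dictionary the function still returns a usable generic prompt, which matches the goal of never presenting a blank page.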
Evan Troxel: It makes me think of Clippy, right? It looks like you're writing a business letter. It looks like you're designing a school. I don't know what those things would be, but you guys start to extract some kind of information based on what it's watching people make, and then maybe it's: it looks like you're trying to do this, would you like help with that? I think that's kind of interesting. So, okay, before we continue down the technical implementation, let's take a step back.
Ben Guler: Yeah.
Evan Troxel: Obviously you're making a product, but that's not the answer to this question; the answer isn't selling the product. I hope the answer to this question is going to be more interesting than that. Why are you interested in this problem and this solution? Why are you trying to get people to use AI to render 3D models in the architecture space? What's the benefit? What's the value in that? What are you guys seeing there?
Ben Guler: Yeah, there are a few different answers. If you look at [00:19:00] the visualization space, the way I see it, it's almost the tip of the iceberg. That's just visualization. What everyone has been trying to do with visualization, up until now and going forward, is simulate reality: physics, unbiased methods or path tracing, where you're trying to simulate what's there in a physics-based way. That requires materiality and things like that. With machine learning it's kind of reversed. I don't start from building the molecules and then render the molecules, as an analogy. You're actually just looking at what you usually see as a photograph, and based on that I can simulate it from a different endpoint. I can actually simulate reality for you; I just need to know where your walls and your things are. So it can get to that hyperrealism. And I'm biased, obviously, but I believe that is the future. With machine learning models, in [00:20:00] fact, there's a lot of R&D and a lot of investment in creating real-time solutions for that too, and video, as you've probably already seen.

So it could become much more like: if you can think about it, you can just have it. And then the challenge, in the current layer of technology that exists, is: what do we have to do? Well, we have to build CDs, construction documents, from that. How do you get there from that vision in your mind? Some of the tools we already have are pieces of the puzzle that will allow us to do that. Essentially it's a grander vision where you can envision that thing, see it, and have the pipeline to get it to the point of: okay, I can actually build this, I can actually get documents. So visualization is just a portion of it. We're building a product for that, but it also sponsors the grander vision: how can we go further than that? How do we get to geometry creation? And again, we have pieces of the puzzle, siloed out a little bit, [00:21:00] not interconnected right now, but that's the path we're looking at, where this could lead.
Evan Troxel: Yeah. It's interesting to think about it as identifying the endpoint. I guess we've always done this to a degree in architectural design, right? You're doing visualization along the way to show where you are, but you're also always trying to show a vision of what it's going to become at some point.

And so even the idea of real-time rendering is painting a picture that doesn't exist yet: making a fully immersive environment, maybe in VR, maybe on a two-dimensional screen that you walk through in real time with Enscape or any other real-time rendering program. It says: this is a vision of where we're headed, where we want to get to. But you're saying it's even more realistic than that, because it paints in these details based [00:22:00] on whatever was used to train it. It's kind of like ChatGPT, which puts one word after another based on the probability of what's been said before, and you're doing that with imagery, with pixels, with elements in a rendering. Okay, I've been trained on this model, and usually when it's looking through glass it sees this kind of thing on the other side of it.

I think it's so weird. To me it is such a weird paradigm. And when you create an image like this that somebody latches onto, a client, or even just a designer during the process, and they say, yeah, that's where I want to head, then you have to figure out how to get there, right? And that's where the value of an architect, understanding space and understanding how things go together, actually comes into play. What I think is really game-changing is that these tools are democratized enough, and it's [00:23:00] already there. Anybody can spool up Midjourney; anybody can download Discord and go into Midjourney. We as architects hated it before when clients came to us with a SketchUp model. Now they're going to come with this idea that is, in their minds, figured out, done: this is what I want. I want this A-frame that looks over this site, that looks exactly like my site that I own. And they're going to come to an architect with that. So should we be scared of that? Or should we just expect it, and then figure out how to give them their vision, how to deliver that project to them? This is the kind of thing everybody's grappling with right now. There's a big fear there, and there's also a lot of opportunity.
Ben Guler: Yeah, it's really interesting. It's kind of like, is it going to take away our jobs? I think that's the parallel question this alludes to. But [00:24:00] the way I see it, the cat's out of the bag; the technology is going to keep expanding in that direction regardless. At least in the shorter term, people should be using this technology to get higher bandwidth, so that you stay, not competent, but competitive, and have access. I think it's a bandwidth thing.

And also, with the shortage of architects able to design, a lot of what's actually in the cities and being built is cookie-cutter that isn't really even designed. So having clients, and architecture itself, be more accessible to broader masses is a good thing overall. Over time it's just going to free up bandwidth. People [00:25:00] still have to go through the process, still have to get through the permit system, and have someone actually, physically build it. That could be a limitation, the actual construction; maybe robotics will assist there, but that would be the next limiting factor. Right now it's the opposite: architecture serves a very select number of people in the world with designed buildings, right? And if that could increase, that just makes a more beautiful world. I think it's a net positive, a net good for the world to have that.
Evan Troxel: I agree with that. I wonder what you think about the output we're all seeing from these image generators influencing what people want in their buildings. Because I think what we've seen over time with the cookie-cutter buildings you just mentioned is a [00:26:00] complete watering down, a dumbing down, of building systems and building materials. Every strip mall looks exactly the same. They go up really fast, they're done really cheaply, they don't last very long. Those, to me, are all the components of cookie-cutter.

And when you look at these images, there's an amazing level of artistry in them, depending on the prompt, right? There are people who aren't prompting towards minimalist modern architecture, people who are prompting towards more fantastical stuff. I wonder if that's going to influence what people want to build, because in many regards we've lost that over the years: because of this watering down, but also because of the cost of things. And it's not to say these fantastical images wouldn't cost a lot to build. Obviously, when I see the curving glass and the buildings made out of feathers and all of these things, nobody can afford to [00:27:00] build that. But at the same time, people are going to want to see those kinds of spaces built, that kind of architecture built, because it's: wow, that looks incredible, I get excited about that. And it benefits more than aesthetics; it benefits the culture that inhabits that space, the people who inhabit that space. So it's interesting to think about what this could do to influence the way we build again, because so much of what we do now has gone away from that. We, the building industry, and that includes a lot of people who are not architects at all, who don't care about design at all. There are a lot of contractors out there who don't give a crap about design, right? They'll deliver a building any day of the week, because that's what they do, but they don't love design the way architects love design. I'm totally generalizing here, but I kind of do hope for that. Because people go to Italy: you can go to Venice and visit the Duomo, you can [00:28:00] go to Rome, you can go to Florence, you can go to Paris and visit these cathedrals. There's something about those that is absolutely incredible. And yet in the buildings we build today there is nothing like that, for the most part, except for that very small percentage of capital-A Architecture that actually gets built.
Ben Guler: Right. I think I'm in parallel with how you're thinking about it too, where I think the problem so far has been supply and demand, and what's least expensive. Even now it's very expensive to afford a home, even a cookie-cutter one. You get to pick the tile and the paint on the walls, and maybe not even the exterior, because, let's say, it's a neighborhood with an HOA or something like that. So you have pretty minimal selection, and even that will cost you a lot. The systems that we have right now have been optimized for supply and demand.
It's like, I just want the cheapest thing, but, you know, [00:29:00] the biggest size, maybe the three bedrooms and two baths, and I want to pay the least amount for that. Okay, well then you have to get the same module, because then builders can build the same thing, and there is an economy of scale in replicating that. But if there's a consensus, and enough momentum of people wanting more aesthetically pleasing things because they have this technology more accessible, it could be that other products or other tools come about that make that accessible.
Like, okay, well, no one is asking for those kinds of details. And we've even lost the art of knowing how to make those, to a certain degree, at that same level. It was possible before; we just haven't done it. So that could be productized to a certain degree. There's a lot of manufacturing in the building industry that has automated and made things so much simpler, because of what we prioritized and what the demand was for, and it's a lot faster to build things. [00:30:00] But there's not really that need from the customer, at least not expressed at the current value being offered. Like, oh yeah, that would cost twice as much? No, I don't care about it that much. I don't fit in the category of people who could afford something like that.
But if that becomes more affordable, it could be something that society just demands.
Evan Troxel: Yeah. And I think it has something to do with the overall, I hate to say marketing, but what's in people's mindset. What are they seeing? What is coming back at their senses? Is it images like this, or is it the images that most architects already produce? Because of that supply and demand, architects kind of set their own boundaries. They're making decisions on their client's behalf: okay, I don't think they're going to go for that, so I'm going to dumb [00:31:00] my idea down to this. Or, I don't think they can afford this, so I'm not even going to ask them that question.
I'm going to design it like this. And now, to me, because it doesn't cost anything really to produce these images, you can throw out these ideas and get a reaction, but you can also inspire somebody. To me it's this interesting chicken-and-egg: we've produced this end goal, and now how do we build something to that?
How do we build the technology? How do we build the products? How do we build the construction details? How do we build the implementation of all these things to actually make those things happen? It's a really interesting question: are we pushing forward, or are we pulling from the future, pulling everybody forward into that? It's a very interesting paradigm shift that we're seeing happen.
Ben Guler: I would not underestimate [00:32:00] someone's aha moment, when a client gets inspired. Architecture firms do that all the time in competitions: you shoot for the moon. You're trying to get an emotional response and a connection. It's almost illogical.
Like, I really love that idea, that concept, and how it's put together, and I can't even use words to describe why. Instilling that is a human thing, I think: you get attracted, almost illogically, to certain ideas like that.
And if you can actually generate some of those ideas, you can see whether that person responds to certain things or not. It doesn't cost you as much as it did before to explore something like that, to get a bit more extravagant. As a designer, I think that becomes invaluable, because then you can say, okay, I do have certain things about the program.
This building is the right size, the massing is right, it works with the site based on the orientation and all that. So all that [00:33:00] knowledge is instilled there. But then you can play with details at whatever level of detail you want. Do you want just micro details, or do you want to go all the way to subtly changing the massing-level details?
So I think that's a really interesting point about making that ability more accessible, and having that discourse with the client to be able to make quicker decisions and share the vision. Because until you see it... even us, and my background's in architecture, we visualize buildings. We can look at a floor plan and understand the floor plan. Clients don't do that. We had time to flex that muscle, to learn that. An image does so much more than any 2D drawing that we could produce otherwise.
Evan Troxel: Yeah, it's a super visual audience listening to this podcast, a super visual profession. And so, to your point, I agree with that. That is the sustenance that so many of us feed on.
Ben Guler: [00:34:00] Yeah.
Evan Troxel: This is a playground. It is a tool, it is a playground, and it's something you could lose hours and hours and hours to.
But is it really losing them if you're doing that exploration, if you're stretching those muscles? I don't think it is. To me, this is the best kind of playground there is, because it's all about the creative process, and you don't need to show everything that comes back to you in that process.
And it is kind of your job to then curate the ones that you do want to use and move forward with, just like any other design process. This is actually a pretty incredible tool. I'm curious about the implementation. I've seen some videos, but I definitely can't assume that everybody has seen the stuff you guys have posted on your website and on LinkedIn.
But can you spell out what it's like to use the product? I've seen everything from a very rudimentary SketchUp model, just boxes stacked on top of each other, [00:35:00] giving some pretty cool output, to stuff that's a lot more detailed. So maybe just talk through what it's actually like to use the product, how it works, and how somebody would go about using it.
Ben Guler: Yeah, sure. So let's say you're in Revit and you've just modeled five walls; it's very much the massing stage. For that stage, we have one slider for geometry override. If you max that slider all the way up, it will do more aggressive, more progressive things to that base mass.
So it would add windows, or it would add a lot of detail that doesn't exist there, based on what you've written in your prompt. So that's the earliest stage, where you just have blocks, boxes, and then, you know...
Evan Troxel: give an idea of like, what a, what a prompt you would write into there. What, what would that be like?
Ben Guler: Sure. Like, a modern, minimal building with a curtain glass [00:36:00] wall, or something like that. You want to spell out some of the materiality that's in there, and maybe some color palette, you know, with green or orange, or aluminum ACM panels, something like that.
You could specify that, separate those terms in there, and then you would render it. The way it works is that every time you render, it uses a new seed, so you could render that same prompt ten times and you might get, say, five good ones and five bad ones. You could lock the seed too, but I'm getting into details now.
So if you like an aesthetic and you just want to push, pull, or move things around, change your mass but keep that aesthetic, what does it look like if I change my geometry? You can keep that aesthetic and just play with the mass itself: the geometry, not the aesthetic.
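The seed-locking behavior Ben describes is how diffusion-based renderers generally behave: the seed fixes the stochastic "aesthetic" choices, while the geometry input is honored separately. A toy Python sketch of that separation follows; the `render` function and its style fields are hypothetical illustrations, not Veras's actual API.

```python
import random

def render(prompt: str, geometry: str, seed: int = 0) -> dict:
    """Toy stand-in for a diffusion render call (hypothetical API).

    The seed fixes the random 'aesthetic' choices; the geometry input
    is carried through independently, which is why locking the seed
    lets you push and pull the massing without losing the look.
    """
    rng = random.Random(seed)
    # Style decisions are drawn only from the seeded RNG.
    style = {
        "palette": rng.choice(["warm wood", "cool aluminum", "green accents"]),
        "lighting": rng.choice(["overcast", "golden hour", "dusk"]),
    }
    return {"prompt": prompt, "geometry": geometry, **style}

# Same seed and prompt, different massing: the aesthetic is retained.
a = render("modern minimal building", "massing v1", seed=42)
b = render("modern minimal building", "massing v2", seed=42)
assert a["palette"] == b["palette"] and a["lighting"] == b["lighting"]
```

Re-rendering with a fresh seed each time corresponds to Ben's "ten renders, five good ones": the prompt stays constant while the random draws change.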
Evan Troxel: I think it's really important that you said that, because a lot of people wonder: if I find a look I like, I don't want to lose it. A lot of people don't even know what's possible here, so I think it's important that you brought that up.
Ben Guler: You want to save that look. And we're actually working on a whole system that [00:37:00] lets you bring in a bunch of other renders and automatically set the UI from renders you've done in the past, which is really cool.
Evan Troxel: Okay, cool.
Ben Guler: But that's the main slider, the one at the very top. The more you lower it, the more of the model is retained. If you take retention all the way to the max, you'll get all the detail, even the line details. So if you have certain mullions designed a certain way, those will come through.
And all it will do then, you can think of it like a sketch: it just fills in the colors for you, and colors within the lines, because you've constrained it to that degree. That's where it starts to be used more as a render engine, where it's like, okay, I'm just rendering.
I want my geometry. I've already modeled it; I spent time to model my door to look that way. But I don't have a good wood texture that also has good reflectance next to this other material, or I don't have the time to spend on that, because I'm going to change it to metal next. I'm trying metal [00:38:00] and wood right away, and they have different levels of reflectance. So that gradient lets you, depending on what stage you're at when you're trying to create a rendering, control how much detail you want to preserve from the model.
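The geometry-override slider Ben walks through behaves like the "strength" knob on an image-to-image diffusion pipeline: low values keep the source render, high values let the generator invent. Here is a toy linear illustration, not EvolveLab's implementation; real img2img strength controls how much noise is added before denoising, but the intuition is the same.

```python
def geometry_override(model_pixels: list[float],
                      generated_pixels: list[float],
                      override: float) -> list[float]:
    """Toy linear picture of the geometry-override slider.

    override = 0.0 keeps the model exactly as drawn ('coloring within
    the lines'); override = 1.0 lets the generator replace it with
    invented detail such as windows that were never modeled.
    """
    if not 0.0 <= override <= 1.0:
        raise ValueError("override must be in [0, 1]")
    return [(1.0 - override) * m + override * g
            for m, g in zip(model_pixels, generated_pixels)]

base = [0.2, 0.8, 0.5]      # what you modeled
imagined = [0.9, 0.1, 0.4]  # what the prompt hallucinates
assert geometry_override(base, imagined, 0.0) == base  # pure render-engine mode
```

At the other extreme, `geometry_override(base, imagined, 1.0)` returns the fully invented detail, which matches the "five walls at massing stage" end of the workflow.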
Evan Troxel: So this all works from a global perspective, no matter where you are in the design process. You could be really early on, super basic massing, or you could be a lot further along, maybe DD level: you've got your curtain wall panels defined, you've got your overhangs, you've got thicknesses roughly correct, and you can adjust that slider down as you move along.
Right? The more decisions you make in the modeling process, the more you want to retain those decisions and have it become a purer rendering engine the further you go. And yet, at the same time, you don't have control over "this wall is wood" or "this roof is white." You don't have that level of [00:39:00] specificity when you're prompting it, right?
So you still get a lot of variation showing up in the results coming back to you throughout that process. And I think what's interesting about this is setting the expectation in people's minds about what this tool is good for, instead of expecting it to work like another tool you've used in the past, because it doesn't.
This is a new kind of tool in the toolbox of a designer. And I think that's interesting, because it shifts the way we approach design and decision making, depending on who's driving that process. Like I said earlier, you could have a client coming to you with images that they've used to make decisions about what they want to do, even as a starting point.
I think it's fascinating to think about how this shifts our approach in the design process, based on these tools that have come along.
Ben Guler: Yeah, that's really [00:40:00] interesting. It reminds me of one of the first architecture firms I worked at. We did this, but in the manual way: we had magazines, and we'd say, okay, our client likes these five. We'd make a design booklet: these are the great images. Then we'd walk through it, you like this one, you don't like that one, and from that we'd come up with a design together.
It's similar to that, but way more automated. And the technology is updating really rapidly, so things like you've mentioned could be a possibility in the future.
Let's say you do want this wall to be wood, but you don't know which kind of wood. Okay, I want oak, or white oak, or red oak. You could actually have that granularity: I know the category, but not the specific one, or the sheen, or the paint on the walls.
You could have that kind of micro ability to finesse those things. So yeah, like you said, I think it's [00:41:00] already getting really close to spanning that entire gamut from early SD through DD: I want to just try something new in this little space here.
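The category-to-species granularity Ben imagines could be expressed as a per-surface spec that compiles down to prompt text. A hypothetical sketch follows; the surface names, attribute keys, and the idea of per-element prompts are all illustrative, not a shipped Veras feature (Veras prompts today are global).

```python
# Hypothetical per-surface prompt spec: category, then species, then finish.
material_spec = {
    "north wall": {"category": "wood", "species": "white oak", "sheen": "matte"},
    "roof":       {"category": "membrane", "color": "white"},
}

def spec_to_prompt(spec: dict) -> str:
    """Flatten a per-surface spec into prompt fragments.

    Each surface contributes one clause; attribute order is preserved,
    so coarse categories come before fine-grained choices.
    """
    parts = []
    for surface, attrs in spec.items():
        desc = " ".join(attrs.values())
        parts.append(f"{surface}: {desc}")
    return "; ".join(parts)

prompt = spec_to_prompt(material_spec)
assert "white oak" in prompt
```

Leaving out a fine-grained key (say, `species`) would correspond to "I know the category but not the specific one," letting the renderer fill the gap.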
Evan Troxel: I think it's super interesting, and I do think we are just scratching the surface. You've said it several different ways: the cat's out of the bag, the tip of the iceberg, we're scratching the surface, the toothpaste is out of the tube and you can't get it back in.
We can't hit undo on this. It's already here. And it's going to be interesting to see how the development moves forward to let the designer's wisdom specify that certain elements look a certain way in this process. That's all going to come, right?
But for now, where we are, I think what's super interesting is how it's changing the way we can produce imagery, design, and inspiration along the way.
It's pretty crazy to think about. One of the things that really [00:42:00] blows me away about the image generation through this rendering process is that I can actually influence what the outcome is going to look like based on geometry that I build as a designer, versus some of the other stuff we've seen, where it's like Midjourney designed this building and now people are looking for ways to reverse engineer that back into a 3D model.
We've seen Kaedim and different tools like that come out over the past couple of years, right? Where it's: go full-on to the end and then reverse engineer back, versus us helping that process move forward. We're making our informed decisions on form and space and adjacency, and then we're using Veras to visualize that.
I think what's super interesting to me, though, is just how realistic it looks. I've always been blown away. And what's crazy about that is that I also have a visualization background. Like you, I was always really into rendering and modeling, and materiality and lighting, translucency and [00:43:00] transparency, reflection and specularity, bump maps and normal maps and displacement maps.
Okay, I get it. I learned how to be an expert in all that stuff. And these models were trained on photorealistic imagery and photography of real-world outputs, "real world" being kind of a loaded term there, because it could just be a rendering.
Decades of visualization work were used to train these models. So what does glass look like in front of metal panel? What does glass look like with lights behind it, on or off? What does it look like at this time of day? I never expected that to be the way this would turn, because I've always controlled that.
I've always had to be a computer scientist, basically, to understand how [00:44:00] materials work, and to look at materials on a day-to-day basis. I'm looking at my wall right here, looking at the texture on the wall, and thinking: how would I reproduce that in 3D? I never once thought it would be, oh, you just trained the computer to do that based on all the other images in the world that have ever existed, and now it can just do it.
That's crazy to me.
Ben Guler: Yeah.
Evan Troxel: Absolutely. I wonder how you guys think about all that.
Ben Guler: Yeah. I remember, back in university, when I was learning Maxwell Render, which is an unbiased rendering engine, and it's still...
Evan Troxel: I love that one.
Ben Guler: It's really good, even now. It's one of the best; they claim it's the most physically accurate. And I remember when I was learning it, because you could layer and build some very complex materials. In the material editor you can do so many things, like you were saying: displacement maps, anisotropy, spec maps, all those different things. I would look at materials and ask, how would I [00:45:00] classify this thing?
I know this is actually reflective, and why it's reflective, why they color it that way, and you learn about the physics of materiality and all that science. I'd look at this table here: it has a very gentle sheen with a high roughness, or something like that. And what would I do with the texture layer? I'd have to make sure it's seamless, and all that.
I remember when I was learning that, I would look around a lot and start to wonder, what's that called, and how would I make it? That takes resources and dedication, to learn all of those things. And architects have to learn that, and codes, and there's so much to learn, and even more as codes grow and civilization grows and more things are invented: new types of buildings, new functionalities. It's an ever-growing thing. This is like the shortcut to that. What if [00:46:00] you don't have to learn how to do all that stuff?
It's like, we learned it, we've got it, we did a good job. I still think there's a space for that, for physics simulation; that, I think, will never go away. But it's the same with painting: it didn't fully go away, but before photography that was the way you would get a portrait of yourself, and it was limited to a certain class of people who could afford it. In a similar vein, I think this technology has a parallel pattern there.
Evan Troxel: Yeah, that's a good point. And I also think there's this misunderstanding about what it took to become a visualization expert. I've had the guys from Neoscape on this podcast a couple of times, who are visualization experts to the nth degree, so much so that that's all they do.
And there are people who work in firms where that's all they do, because it takes a level of commitment and dedication to that craft to be so good at it. [00:47:00] It's interesting to me that the people making the decrees, who are freaking out about this, maybe aren't paying attention to it enough, now that I say it out loud. They don't fully understand what it takes to produce images.
And there's always been this battle in architecture between visualization people and designers or project team people, right? The design's always changing. We're always modifying geometry up to the last minute, and the renderings are due right now as well. So why aren't the renderings done?
Ben Guler: Yeah.
Evan Troxel: There is no easy button there. There just isn't. And tools have come along to make that easier, right?
Enscape, a live link between a model and a good-enough rendering in real time, is absolutely incredible. And that has also kind of skewed the perception of [00:48:00] what it takes, realistically, to produce a great image, and it has democratized images basically to the point where they're free.
Right? All I have to do is spin the model around, change the time of day, maybe mess with the clouds, and boom, I've got a new image. It's fantastic.
And now we're taking that even farther, and I wonder what this does to our process. How do we need to adjust our process to take advantage of this rather than be scared of it? This is another tool in the toolbox, and we should be thinking about how we adjust our workflows. Are people thinking about that enough? What do you think? I mean, you guys make a tool that makes this easy for people, that democratizes access to it in a tool they already know how to use, because they're already building their model.
They're building it in SketchUp or Rhino or Revit, and now Veras comes in and says: what does this look like in the snow? What does this look like on the beach? You have a lot more [00:49:00] sky's-the-limit possibility when it comes to rendering now, in a moment, right?
I can ask what this looks like in the snow and what it looks like in the summer, and I can do that with a shift in the prompt. You've said it before; you talked about the essence of it boiling down to us being able to use our time more wisely.
But how do people actually need to be thinking about this, and how it shifts how they approach visualization as a piece of the puzzle when it comes to delivering a project?
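The "snow versus summer" point amounts to templating scene modifiers onto one fixed model view. The names below are illustrative; a real AI renderer would feed each string to a fresh render of the same viewport, while here we only build the prompts.

```python
# One model view, many contexts: only the text prompt changes per render.
BASE_PROMPT = "modern minimal building, curtain glass wall, ACM panels"
SCENES = ["in the snow", "on the beach", "at dusk in summer"]

def scene_variants(base_prompt: str, scenes: list[str]) -> list[str]:
    """Append a scene modifier to a base prompt for each variant render."""
    return [f"{base_prompt}, {scene}" for scene in scenes]

prompts = scene_variants(BASE_PROMPT, SCENES)
assert len(prompts) == 3
assert prompts[0].endswith("in the snow")
```

The design point is that the geometry and camera are reused untouched, so the marginal cost of an extra scene is one more string, not one more texturing and lighting pass.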
Ben Guler: Yeah, so I'll say this, because when I was a BIM manager this was happening at the previous firm I was working at. When Enscape came out, that was a game changer for us.
Before Enscape, we always went through 3ds Max. We brought in the Revit model and we did all the textures; we textured everything. [00:50:00] And obviously you can do a lot more in 3ds Max. Sometimes, if we had the budget, we'd add animations, add VR walkthroughs so people could walk in there.
But that was sometimes a one-week lead time, or a three-day lead time, where it's like, okay, you have to finish your thing, and in three days is the deadline.
Evan Troxel: You have to tell the designer to stop now. No, you really have to stop. You can't give me an updated model anymore, because it's pencils down, now we get to work. Right? And if you change the design, we have to redo work. And that has shrunk down to almost nothing now.
Ben Guler: Exactly. So back then, when Enscape came about, an interesting discovery was: with Enscape, we're okay with not having the highest quality. It's good enough, because I'm staying in the same environment and I can move this door five feet over and see it. It's fine. And it became more of a tool for the designer. All our designers were like, now it's their [00:51:00] render; they could produce the render themselves. We don't have to wait on that person who's full-time doing just that, who might have other jobs queued up: I'm doing rendering for this client, then this one, then you have to get yours done.
No. Everyone has access to this tool that you can use as a design tool. This is a further step in that same vein: now you can get really good results, photorealism, and not only that, but you can explore as you're designing and moving elements around. You can get inspiration.
It's like your digital Pinterest superimposed onto your 3D model.
Evan Troxel: Fully integrated.
Ben Guler: Right, exactly. So you can do all those things that used to require way more skills from different specialists. You can just continue designing and visualize it for yourself, and for others you want to show it to: clients and so on. It's the same thing, but put a different way: [00:52:00] it's really compressing all of that, which allows one architect to do a lot more without learning a whole spectrum of tools and techniques on their tool belt, because they already have so many things they have to learn.
Evan Troxel: I've had the Enscape guys on this podcast before, and we've really talked about what was a game changer about Enscape: beyond images happening in real time, it actually becomes a tool to help make decisions. And so now a rendering engine is a design tool.
And that was a paradigm shift, because before that, rendering came at the end.
Ben Guler: Oh, yeah.
Evan Troxel: We visualize things at the end of this milestone, not during this milestone. Are you kidding me? When I learned how to 3D model, we modeled in wireframe. It wasn't even shaded, because the graphics cards couldn't handle anything more.
And now, paradigm shift: we make decisions by looking at this in a much higher-fidelity manner, visually, throughout [00:53:00] the process. And that just got taken up a notch. A big, big notch, right? You can model the most basic thing, type in a prompt, and get something that actually looks like a building, whether it's realistic or not. It's like, okay, I like that.
I'm going to go in that direction. I like that piece of it. I'm going to explore that.
And it becomes a prompter for you. It's a circular prompt now. You're prompting it. It's prompting you.
Ben Guler: Right, 'cause you get...
Evan Troxel: And I think that is an incredible paradigm shift, once again, that we're dealing with and figuring out.
And so when it comes to this, how do we change our workflow? How do we address these technology changes as we evolve together, technology and architecture? Maybe that's it: we're prompting it, and it's prompting us back. And I think that's what's exciting about this. If we think about it like that, all of a sudden it becomes a [00:54:00] tool that I can understand how to use.
Because we have talked about the lack of control that you have in this situation, and you guys have a slider, right? Geometry retention: how much of my model do I want you to respect? And by just playing with that slider, we're telling it how much to prompt us back, how much we've decided we want to keep, versus: no, I'm open, I'm open to seeing what you come up with here, based on everything you've ever seen, everything you've ever been trained on. I think it's a super interesting paradigm.
Ben Guler: Yeah, and I'll mention one quick thing. We're about to release another feature, coming soon, where you can have variable amounts of override: okay, respect this a lot, but over here I'm playing, so you can play a little bit there. We're constantly improving things and trying to add [00:55:00] different features that complete that story of synergy, where you're prompting back and forth at different degrees, wherever in your design you haven't made those decisions yet, so that you can continue that dialogue.
Evan Troxel: That's interesting. Where did that come from? Does it come from your users? Is it coming from you guys? Where's that idea from?
Ben Guler: Yeah. So we have our forums, and a lot of people have asked for things like the whole geometry override. Multiple times people said, can I actually have this? I don't want to just explore things, because then it's just a toy where I'm getting a lot of ideas, but I'm not at that stage of the progress. So that was a very concerted effort: okay, how do we hone that in, get the best settings and the best setup so we can deliver that, in Rhino, Revit, and SketchUp. And this new feature is another case where people have talked about it.
I think in our forums we log all the different requests into [00:56:00] one master list: people want this, this, this. Then we rank it in our roadmap: okay, this was asked more than a few times, so we've got to get this one next. We prioritize things like that.
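The "asked more than a few times" ranking Ben describes is, at heart, a frequency tally over logged requests. A minimal sketch with made-up request names (the actual EvolveLab process surely involves more judgment than a raw count):

```python
from collections import Counter

# Hypothetical log of feature requests gathered from forums, calls, and demos.
requests = [
    "geometry override", "seed lock", "geometry override",
    "per-surface materials", "geometry override", "seed lock",
]

# Rank roadmap candidates by how often users asked for them.
roadmap = Counter(requests).most_common()
assert roadmap[0] == ("geometry override", 3)
```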
But yeah, we've also heard that feedback a lot of times in demos with clients, where we talk to them and see how they use it, which is really cool. For anyone out there writing software: just seeing someone else use your tool is so eye-opening.
Bill had a call the other day with someone who wasn't getting the right results, and they were like, oh, this is just crap. When are you guys going to make the good version that gives you the good results? And within ten minutes it went to, oh wow, you guys are killing it. What's going on here? Why didn't I know this is a great tool? So seeing how your end users are using the software is such an eye-opener, because you can configure and sculpt your UI so that it gets people through that journey. But [00:57:00] yeah, most of these features are things people have brought up on calls, demos, or the forum.
Evan Troxel: Right. I wanna give you a chance to talk about some of these other things a little bit, at least to tease 'em out there. But before we do that, I just have one more question about Veras and AI rendering: where do you actually see this going in the next year? To give people an idea, with your very boots-on-the-ground view, you're really grounded in the research side of this, and you're obviously producing a product that is taking people on this journey.
What do you see coming that you could maybe tell people about?
Ben Guler: Yeah. A year's a long time in this space.
Evan Troxel: I know.
Ben Guler: That's a lot.
Evan Troxel: That's why I only said a year.
Ben Guler: Yeah. No, I'm pretty confident that, again, Midjourney's kind of the leader in text-to-image quality. It's almost indistinguishable from photographs, [00:58:00] with very beautiful outputs. I think we'll be there within a year for your model: you'll be able to get that kind of quality from whatever you have in your model, to the degree that you've modeled it. I'm pretty confident we will be there. All the things being developed now are being developed with smaller footprints, so we can lower the amount of VRAM and hardware it takes. So even faster, I think; speed, we're gonna see that. And higher resolutions: models are being trained at higher and higher resolutions. All those things are coming, I think that's a given, within a year.
We're also looking at bridging some of our tools, and this will be the beginning of that. We've already seen this in the industry a bit from other people who are exploring it: how do you get some of those ideas that are being rendered back into your model as [00:59:00] metadata?
Like, let's say I'm rendering this with high geometry retention, do not override my geometry, and I wanna keep those materials, I really like those materials. How do you generate those? That's pretty trivial to do. So I think we'll see more tools where it starts to bleed beyond just rendering. If these decisions were made, I don't want to redo them manually. It goes back to the computational designer. It's like, okay, this is just science: you take this input, connect it to that input, and generate things like that. So we'll see.
Evan Troxel: Those are all really practical. Those are super practical things, right? It's like, you get an image and it's like, okay, yeah, I'm gonna go with that, at least for now. And I'm gonna say, yep, take that wood, that's glass, that's this color of metal, whatever, and actually just get that back into my model without me having to do the manual labor to catch it back up to that point.
I think that totally makes sense.
Ben Guler: Yeah, exactly that.
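[Editor's note: the render-feedback workflow Ben and Evan describe can be pictured with a toy sketch. This is hypothetical, not a Veras API: the element IDs, the `apply_materials` helper, and the dictionary shapes are all invented for illustration.]

```python
# Hypothetical sketch: after an AI render with high geometry retention,
# write the materials the renderer settled on back onto the source
# elements as metadata, so the designer doesn't redo them manually.

def apply_materials(model, render_result):
    """Copy suggested materials onto matching model elements;
    return the IDs of the elements that were updated."""
    updated = []
    for element_id, material in render_result.items():
        if element_id in model:
            model[element_id]["material"] = material
            updated.append(element_id)
    return updated

model = {
    "wall-01": {"category": "Wall", "material": None},
    "roof-01": {"category": "Roof", "material": None},
}
render_result = {"wall-01": "white oak", "roof-01": "standing-seam metal"}
print(apply_materials(model, render_result))  # ['wall-01', 'roof-01']
```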
Evan Troxel: Well, let's talk about some of your other stuff. Let's talk about Glyph, and you mentioned [01:00:00] Morphis and Helix. Just give people an idea of the suite of what EvolveLab is offering here, as far as your latest products that are coming out. You are always interested in saving people time.
That to me is the big deal. If you could put it all under one banner, that to me is what it's about, which I think really speaks back to your BIM manager heritage and your visualization heritage: you come from practice, so you understand the labor involved to get to these points.
So maybe let's just put it all into that bucket so people get it, but now you can break it down into the individual components. Let's start with Helix.
Ben Guler: Sure. And I'll add one other thing to what you said. Because we're from the industry and, like you said, we're trying to just automate things. It's also that, how do I put this, because we're grassroots, [01:01:00] we've had to do these things manually ourselves with the tools we've used. I recently listened to your Speckle podcast, and the interoperability discussion there is a good segue to Helix. Basically, we didn't build a tool that does everything for everyone.
We're asymmetric, is what I'm trying to say: we're going where the people are. If people are on Revit a lot, a lot of our stack is developed out of that, because that's where the people are. If people are on SketchUp, that's why we made those two things, because they wanted a solution for that. So a lot of the time our approach is a bit asymmetrical, because we come from the industry and because we're trying to address the consensus of how people gravitate towards different tools.
That could have been more succinct, but anyways.
Evan Troxel: That totally makes sense, because I've wondered myself: why did you guys develop Helix to bridge the gap between SketchUp and Revit? And you're basically answering that question.
[01:02:00] You're saying that's where the people are; this is the thing that people keep asking for.
Ben Guler: Yeah. So let's start with Helix. Helix actually started, I think, in 2016, way before I joined; I joined in 2019. That was version one of Helix, and it really stemmed from Bill, who was working as a BIM manager in the past and had run into that problem.
He tried to get people to just not use SketchUp, to go to this other thing, and he failed that battle. So it stemmed from that: I really wish I had this as an end user, so let's just see if we can do something about it. That was the nascency of the tool. We actually took it off the market for a while; we had some issues with it, and we paused development because we didn't have the resources to invest in that tool for a little bit. Then, after I joined, within a few months we revamped it and started developing this.
That's when we released version two of Helix, which did away with the file format. We used to have a [01:03:00] Helix format that you'd use to go back and forth, and we just said no, just stream data from SketchUp to Revit and Revit to SketchUp. There are a lot of users in both of those tools that you want to talk to.
And what that allows us to do is have this asymmetrical approach where it's like, okay, I just want to get stuff from SketchUp into Revit with textures and UV mapping, so the textures are correctly aligned. Everyone wants that. So as different APIs opened up, we built that, essentially, so that you could get it.
But then again, we also want walls to be walls and roofs to be roofs. So the very first version of, of, of Helix was that everything was vilified, essentially. Like everything in in SketchUp that is a wall, uh, wall or, or roof or things like that actually came in as a native. Roof element, a native wall with the slope and everything so that it, it's as if you modeled it Revit, that was
always kind of the substrate where we wanted to really make sure that that's a foundation.
We started there, basically. And the challenge [01:04:00] we've had with that is: how do you get people to map those things? So we started to automate some of them, to expand and allow some flexibility. Let's say you have a mannequin for your store; you don't want to have to model that parametrically.
But if we were to bring it in as a mesh blob, how could we do it in the best way, the most BIM way, without polluting the model, slowing it down, and adding size? So we've built in best practices for how we bring certain content into Revit, and we can also bring Revit content back into SketchUp.
But that leg is not as well developed, since there are a lot of tools out there that already do that, and it's not such a hard challenge. The biggest challenge was getting unstructured data from SketchUp into structured BIM data in Revit, and I think we have a pretty good solution for that.
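[Editor's note: one way to picture the "unstructured mesh to structured BIM" step Ben describes is classifying faces by their orientation. This is a minimal sketch with invented thresholds; Helix's actual mapping logic is certainly more sophisticated.]

```python
import math

def classify_face(normal):
    """Crude classification of a mesh face by its normal vector:
    near-vertical faces read as walls, near-horizontal faces as
    roofs/floors, anything in between as a sloped roof.
    Thresholds here are illustrative, not Helix's."""
    nx, ny, nz = normal
    length = math.sqrt(nx * nx + ny * ny + nz * nz)
    tilt = abs(nz / length)  # 0 = vertical face, 1 = horizontal face
    if tilt < 0.1:
        return "Wall"
    if tilt > 0.9:
        return "Roof/Floor"
    return "Sloped Roof"

print(classify_face((1, 0, 0)))  # vertical face -> Wall
print(classify_face((0, 0, 1)))  # horizontal face -> Roof/Floor
print(classify_face((0, 1, 1)))  # 45-degree face -> Sloped Roof
```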
Evan Troxel: Nice. And I guess the question becomes: has it actually forced people to model any better in SketchUp to begin with? Because you think about the transition from SketchUp to [01:05:00] Revit in any firm, and it's like, this is fantastic, because it's a chance to start over, right? To actually build the model with a new set of insight.
Because a lot of times, when people are designing in SketchUp, it really is a clay model. It really is loose. It really is not accurate, for good reasons, right? And I think that's one thing that people who say you should do everything in this tool, usually Revit, don't understand: that design process needs to be loose and messy and non-structured and all of those things. So don't put those constraints on that person, because if you were to, you're not gonna get as interesting of a result, 'interesting' being just one example of a word I could have used in that sentence, right? So I'm asking now: has this actually created a new standard to which people need to model in SketchUp to get good results into Revit, so that you actually do save that time? Or is that still kind of where it's always [01:06:00] been?
Ben Guler: I think it varies. More power users are more cognizant of that, because what it does is let you model things way faster in SketchUp: large masses, moving things around. We've even had a client who was using it with a deadline the next day. He said, okay, I'm gonna give Helix a try, because there was not enough time in the day to model all of that in Revit, with levels and everything, and curtain walls and all that. I have this thing, it's already massed; if it breaks, I'll miss the deadline, and if it doesn't, I'll be super happy. And it worked out. All the components moved around, copied over, and mapped as Revit elements. He brought it into Revit and was able to cut planes and sections and plans, stuff like that. So that was really great, but it really does vary. That's a great success story of someone who knows both softwares pretty well, but then we see users that just know SketchUp really well or just know [01:07:00] Revit really well, and people in SketchUp really like that flexibility, which makes sense.
Like, I actually love SketchUp for a lot of the house projects that I do. It's very flexible, easy to pan around, things like that. And I still love Revit, because I've scanned my house, so I have a LIDAR scan and I can check everything, make sure it's verified, and move things around as much as I want to.
But there's still a huge benefit even in that middle ground, where if the SketchUp model is not that structured, it's still a much better starting point, because you have dimensions and you can easily stretch and flex. Like, hey, they didn't respect the brick coursing, so it doesn't add up to a full brick at the end.
Okay, well, as more of an engineer-architect, you're gonna just move that over two inches, and in Revit you can make those subtle movements to make things accurate. If the content is already there, I would like to start from that instead of drawing every single thing, making the heights, making all the levels, and connecting all those things.
If you start from something that's already there, it still helps the process. It could cut down [01:08:00] something that would take eight or sixteen hours to like two or three hours. It's still work, but it's much quicker.
Evan Troxel: All right. So Helix in a nutshell gets geometry from SketchUp to Revit, and it BIMifies it. Super useful for the right audience, definitely. All right, so let's jump to Glyph. Talk about how Glyph can save people time. What's
Ben Guler: Sorry, one more quick side thing on Helix: we also have AutoCAD, actually. That one's more focused on 2D plans. It can detect double lines and create walls and doors, make them native walls. So we also have something for AutoCAD links in Revit.
Just a quick thing, if people are interested.
Evan Troxel: No, I think that was worth bringing up, because basically it's an automatic tool to go from 2D AutoCAD to 3D in Revit. And again, if you're starting from that point, versus having to draw all those and snap to the underlay, and bring in the underlay, and what do you do, people [01:09:00] explode that? I hope not, right? All those things. Just bring it in and build a model. That's pretty great.
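[Editor's note: the double-line detection Evan summarizes can be sketched as pairing up parallel segments whose offset falls in a plausible wall-thickness range. The line representation, thresholds, and function name here are all illustrative; real DWG parsing is far messier.]

```python
def find_wall_pairs(lines, min_w=0.2, max_w=0.6):
    """Pair up parallel horizontal segments whose perpendicular gap
    falls in a plausible wall-thickness range and that overlap in plan.
    Each line is (y, x_start, x_end) for a horizontal segment."""
    pairs = []
    for i, (y1, a1, b1) in enumerate(lines):
        for y2, a2, b2 in lines[i + 1:]:
            gap = abs(y1 - y2)
            overlap = min(b1, b2) - max(a1, a2)
            if min_w <= gap <= max_w and overlap > 0:
                pairs.append(((y1, a1, b1), (y2, a2, b2)))
    return pairs

# Two lines 0.3 apart read as one wall; the third line is unrelated.
lines = [(0.0, 0, 10), (0.3, 0, 10), (5.0, 0, 4)]
print(len(find_wall_pairs(lines)))  # 1
```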
Ben Guler: Yeah.
Evan Troxel: All right,
so talk about Glyph. What's Glyph do?
Ben Guler: Sure. So the whole vision with Glyph was: it's a Revit tool, all within Revit, and you have the Revit model and you have metadata in that model. So how can you automate the documentation process? That's kinda the goal, the target.
Now, it does a lot of that already, but it's not a hundred percent what we want it to be; we still have a lot of development to do on it. But basically, it forces you to host more data in your Revit model, because the Revit model informs how the documents are generated: automatically generating views, sections, dimensions, and tags, creating sheets, and placing views on sheets, all within presets that you can create.
And those presets can be collated into one grouping, which we call a bundle. You could press that, like your SD set [01:10:00] bundle. Then the SD set will take the SD model you've just generated and put out all the views for you, based on the schema you've defined before.
Like, this is how I usually do my SD set: I have four views, I have four elevations, these are the settings. You pre-configure that, and those configurations can be shared across different models. That's kinda what Glyph does. The underlying thing is: how do we automate more and more? When I was a BIM manager, a lot of the time the templates were subtractive, and a lot of people have that. This is how Glyph augments this piece of the Revit model. With a subtractive template, you would just delete views; you'd have ten stories and delete seven of them because you only have a three-story building.
The whole idea with Glyph is that it's an additive approach. You can have a very thin Revit template, and because of the additive approach, you just add on: it creates views for you instead of you subtracting them. As more APIs were exposed, that allowed us to do a lot of that more automatically. So that's [01:11:00] maybe a long way to put it.
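[Editor's note: the additive idea can be sketched as expanding a preset bundle against the levels a model actually has, creating views rather than deleting leftovers from a fat template. The preset schema and the `run_bundle` name are invented for illustration, not Glyph's API.]

```python
def run_bundle(bundle, levels):
    """Expand a preset bundle into the list of views to create:
    per-level presets produce one view per level in the model,
    one-off presets produce a single view."""
    views = []
    for preset in bundle["presets"]:
        if preset["per_level"]:
            for level in levels:
                views.append(f'{preset["type"]} - {level}')
        else:
            views.append(preset["type"])
    return views

sd_bundle = {"presets": [
    {"type": "Floor Plan", "per_level": True},
    {"type": "North Elevation", "per_level": False},
]}
# A three-story model gets exactly three plans plus one elevation;
# a thin template plus this bundle replaces deleting unused views.
print(run_bundle(sd_bundle, ["Level 1", "Level 2", "Level 3"]))
```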
Evan Troxel: That's cool. It's, again, a big time saver. And I think about it like, obviously, architecture's deadline-driven, and if you can spend more time on the model and automate the sheets and the layout, placing views on sheets, tagging things, dimensioning things, anything like that, then in the end all you have to do is make a few adjustments.
Ben Guler: Right.
Evan Troxel: You can spend more time doing the thing you're really good at, or not even good at, valuable at, which is the architecture, not the drafting. What I'm so interested in with these tools is that kind of idea: stop trying to compete with other firms on how good of a drafter you are.
That's not what we should be doing. We should be spending our time on the architecture side, right? And this is a tool that I think definitely helps in that regard.
Ben Guler: Right.
Evan Troxel: Anything else about Glyph that you wanna throw in [01:12:00] there?
Ben Guler: A cool thing we did a few months ago was collision detection for annotations and dimensions. That's pretty cool, actually; we're pretty proud of that one. Basically, if you have overlapping tags, they won't end up overlapping, because the tag scores the available white space. That was something we spent some significant resources to figure out.
Evan Troxel: Nice.
Ben Guler: People should check it out.
Evan Troxel: So even less adjusting
stuff after the fact because yeah, I mean, you can't have a drawing with overlapping tags. They have to be
Ben Guler: That's kind of some of the feedback we got too, where it's like, well, yes, this automates that, like Tag All in Revit, but that's more work I have to delete; it's more work than actually just placing the right ones. So in the effort to automate it accurately, you don't want to create more work for yourself.
That's kind of the nuance: you want it to actually be what you would place as a designer.
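[Editor's note: the tag collision avoidance Ben mentions can be illustrated with a greedy placement loop: try candidate offsets around each tag's anchor and keep the first spot that leaves the white space clear. The offsets, sizes, and function names are made up for this sketch; this is not EvolveLab's algorithm.]

```python
def overlaps(a, b):
    """Axis-aligned rectangle overlap test; rects are (x, y, w, h)."""
    ax, ay, aw, ah = a
    bx, by, bw, bh = b
    return ax < bx + bw and bx < ax + aw and ay < by + bh and by < ay + ah

def place_tags(anchors, size=(2, 1)):
    """Greedy tag placement: for each anchor, try candidate offsets
    and keep the first rectangle that doesn't hit a placed tag."""
    w, h = size
    placed = []
    offsets = [(0.5, 0.5), (0.5, -1.5), (-2.5, 0.5), (-2.5, -1.5)]
    for x, y in anchors:
        for dx, dy in offsets:
            rect = (x + dx, y + dy, w, h)
            if not any(overlaps(rect, p) for p in placed):
                placed.append(rect)
                break
    return placed

# Two close anchors and one far one: all three tags land clear.
tags = place_tags([(0, 0), (1, 0), (10, 10)])
print(len(tags), all(
    not overlaps(a, b) for i, a in enumerate(tags) for b in tags[i + 1:]
))  # 3 True
```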
Evan Troxel: Yeah. Fantastic. All right. Last tool. What's it called?
Ben Guler: Morphis.
Yeah. So Morphis stems from a lot of different projects we worked [01:13:00] on with a few architectural firms and some construction companies, where we were trying to optimize things, and also from some Dynamo scripts we've done in the past that optimize for certain sectors or verticals. It's actually a pretty flat tool, but it uses generative design algorithms to handle things like adjacencies. It has these things called modules and paths, and as you draw your design together, the structure respects those rules, those constraints.
So if you have competing constraints, like a multi-domain optimization solver, it tries to keep those adjacencies correct. Or if you have a point of interest you wanna put on your site, it tries to move and adjust the design based on that.
But you control the footprint. You could say, okay, I want this to be my area of play, which could be a filled region or a room or an area, and then within that you can subdivide it into more [01:14:00] rooms or other objects. And it has these different modules that you can bake into native Revit elements.
So say you have your design set up; let's say it's an office, right? Each module represents maybe a room or an area, and it can be represented by a few pieces of furniture. If it's a conference room, I have a conference table in there, or other objects. If you group that into a Revit group, then you can map that low-res block into that module.
And as you map those things, you can make a design very rapidly, then click bake, and you get your actual content. To put it in a more succinct way, it's like a super duper cool array tool that does things intelligently. So you can use it for office layouts, auditorium seat layouts, or units for housing.
You could map it to a link or something like that, so you could just array those things. So that's kinda what it is.
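[Editor's note: the adjacency handling Ben describes can be pictured as scoring candidate layouts against an adjacency matrix; a real multi-objective solver would search over placements rather than just score them. The module names, grid positions, and weights here are all invented.]

```python
# Adjacency weights for module-type pairs, keyed by sorted type names.
ADJACENCY = {("conference", "lobby"): 1.0, ("conference", "office"): 0.5}

def layout_score(placements):
    """Score a layout by summing adjacency weights for module pairs
    placed next to each other (Manhattan distance 1 on a grid).
    A solver would maximize this over many candidate layouts."""
    score = 0.0
    items = list(placements.items())
    for i, (name_a, pos_a) in enumerate(items):
        for name_b, pos_b in items[i + 1:]:
            dist = abs(pos_a[0] - pos_b[0]) + abs(pos_a[1] - pos_b[1])
            if dist == 1:
                key = tuple(sorted((name_a.split("-")[0], name_b.split("-")[0])))
                score += ADJACENCY.get(key, 0.0)
    return score

good = {"conference-1": (0, 0), "lobby-1": (0, 1), "office-1": (1, 0)}
bad = {"conference-1": (0, 0), "lobby-1": (3, 3), "office-1": (1, 0)}
print(layout_score(good) > layout_score(bad))  # True
```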
Evan Troxel: Interesting that it's working at different scales there.
I [01:15:00] think that's unexpected, right, for a tool like this? Because most of the time, when people think of generative design and kind of optioneering, it's at a very high level, like site level, not room level.
And we've seen examples of both of those, but when it hits me, I guess I should say, I think about massing, or orientation on a site, or things like that, and not necessarily theater seating layout in an auditorium kind of a thing.
So that to me is pretty interesting: you can use it at different stages of design as you start to dial things in. You can run it on different areas in your building as you've already made decisions on the whole thing. I think that's pretty clever.
Ben Guler: Right. Yeah. That was a lot of the feedback we've seen from people using it. There are a lot of tools out there, outside of Revit, that do this already, like automating floor plans. A lot of those [01:16:00] tools are great, but again, back to something we talked about earlier, we made a very intentional decision to be right within Revit.
You don't have to go to another app; you use the same controls, you already know the navigation system. All you have to do is learn this little sector of how these elements are structured, and then you have everything else without having to make that leap or deal with interoperability. And because it's natively in Revit and it's real time, it's actually very smooth the way it works. We're making it very accessible. The reason we built it this way is that we've seen a lot of Dynamo things we've done in the past for clients that were pretty much a complex array, like a path array, but with a lot of logic, like orientation and all that.
So we wanted to allow the kind of functionality you would get from something that's a lot more complex to build on your own, in a simplistic, accessible way.
Evan Troxel: [01:17:00] Fantastic. That's quite a suite of tools, and it's fun watching these releases come out. It's fun watching the little teaser videos you guys are putting on LinkedIn to show people what's possible. And it's great to see your commitment to developing tools to make people's lives easier in a world of tools that are actually pretty difficult to use, right?
We've talked about visualization as being an expertise, but all of these tools take a level of commitment and dedication to create that expertise, so that you can be efficient in those applications. And it's recognizing that not everybody can be that in every one of these programs.
So democratizing it, so that one person can do more while actually doing less, with the right tools in their tool belt, is pretty fantastic.
Kudos to you guys for doing that.
Yeah, it's very cool. I'm gonna put links to where everybody can find you in the show notes, but I'd love it if you just say [01:18:00] it out loud here and let everybody know where they can find out more about what you're working on and where to go to find it.
Ben Guler: Sure. Yeah. If you just go to our website, evolvelab.io, there's an apps page, and you can see all the different tools we have. We also have the forums, which are a really great resource to see what's going on; all the releases are posted there. That's forum.evolvelab.io. So yeah, those two links are pretty good for people to start with.
Evan Troxel: Nice. That'll do it. Well, Ben, thanks so much for sharing today. I appreciate it, and it is always fun talking to you.
Ben Guler: Thank you. Thank you for the invite, and looking forward to talking to you again.
- Ben on LinkedIn
- Ben on Twitter
- EvolveLab website
- EvolveLab forums
- EvolveLab on LinkedIn
- EvolveLab on Twitter
- EvolveLab on Instagram
- Related episode: TRXL 006: ‘Doomed to Fail’, with Bill Allen
- Confluence podcast