Virtual Texturing | Live from HQ | Inside Unreal

>>Amanda: Hey, all!
Welcome to this week’s news and community spotlight. From Turtle Rock Studios, the creators of the original
Left 4 Dead, comes the Legend
of Zelda-inspired VR title, Journey of the Gods. With a beautiful and
wondrous world to explore, Journey of the Gods
released earlier this year to rave reviews. With the game held
in such high regard, we found out
how they handle worlds of varying size and scale,
where the player takes on the roles
of both human and God in VR. Find out more about
how Turtle Rock optimized Journey of the Gods
for Oculus Quest and why Blueprints
were integral in capturing their unique vision. Created in Unreal Engine,
ZipView simplifies real estate by offering
a high-quality virtual look into residential
and commercial buildings. Olim Planet’s Director
of Team Platform, Mark Kim, talks about
how the product is shaking up Korea’s real estate industry
by providing businesses and clients a fully immersive
3D VR environment. Just to add a couple
quick reminders, our teams will be out in Seattle and New York
for our annual Dev Days and the first ever
Unreal Indie Dev Day. You’ll be able to hear
about the Unreal Engine roadmap, preview our
upcoming open World features, or get need-to-know info about the Unreal ecosystem
and best practices. Sign up today
if you haven’t already. Lastly, don’t forget
about our Unreal Film Jam. We want to see
your short films and animations based on,
“Oh, the places you’ll go.” Submit by October 19th
for your chance at incredible prize packages,
including five grand, personalized project reviews
with Blur Studio, the creators of Love, Death &
Robots, and Epic Games, and more. Working on your submission?
We’d love to see your progress, so share with us
on social using #UE4. Next up is our top weekly
Karma earners: Adnoh, ClockworkOcean,
Ham1ton, KristofMorva, Shadowriver, Livada,
DanielOrchard, Ikav, Mattxm and still making
the list is Everynone. Thank you all so much
for your contributions. Y’all are phenomenal. And now for my favorite,
our community spotlights! This outstanding short
is called Duty. It was created to test out
Unreal’s ray tracing capabilities. This piece leverages Assets
from Infiltrator and the Open World Demo and was built using
an NVIDIA GTX 1080. We’re floored
by the fidelity of this and hope you enjoy it too. If you’ve been looking to
implement water elements into your scene,
Hristo has done a simple breakdown
in a Twitter tutorial. With only 49 instructions
and one Texture sample, this cheap caustics solution
can add a lot of life to the water in your project.
Check out his full breakdown. Last up this week is
this gorgeous 16th century French Baroque bedroom, depicting the abduction
of Persephone, an ancient story
from Greek mythology. Sangers went in depth
about storytelling, Material creation, and touched
on lighting and time management in an interview
with Experience Points. So, we do recommend
seeing the full breakdown. All right, thanks for tuning in.
Have a great week everyone.>>Victor: Hey
everyone and welcome to the Unreal Engine
Livestream. I’m your host,
Victor Brodin and today we are going to talk about
Virtual Texturing in 4.23. To do this, I have invited
Ben Ingram and Jeremy Moore. They are Graphics Engineers
here at Epic Games. How are you guys doing today?>>Jeremy: Great. Thanks.>>Ben: Pretty good, thanks.>>Victor: Awesome. Don’t
have much more of an intro, I think without further ado, let’s just dig right in
with Ben’s presentation.>>Ben: All right.
Is screen share working?>>Victor: Yeah, it’s working.
We’re good.>>Ben: Okay. So, Virtual Texturing
is a new type of Texture streaming that we’ve added
in Unreal Engine 4.23. The basic idea is,
it can take your Textures and divide them up into tiles
of a fixed size, typically 128 by 128 pixels, and then as you move
throughout the World, we’re analyzing what is
actually visible on the screen and looking at the tiles that are required to Texture
those onscreen elements and then efficiently streaming
in just the tiles that are actually required to
show what’s visible on screen. This is an improvement
over traditional mip-based streaming in a number of ways. The granularity is much better; when you’re streaming
a full bitmap, you need to stream
the full large image, whereas with virtual
texturing you can just load in the actual smaller part
of the image that is visible.
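To put rough numbers on that granularity (illustrative math, not figures from the stream): a 4096 by 4096 Texture cut into 128 by 128 tiles gives a 32 by 32 grid, 1,024 tiles in total, and if only a dozen of those tiles are visible, only that dozen needs to stream in rather than the whole mip level. Anyway, in order to set that up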
with Unreal Engine, you’d need to go
into your project settings. So first off, it needs
to be checked right here. There’s a couple
of other options that you can customize
into more detail; normally the defaults are fine. Here you can turn on Virtual
Textures for lightmaps as well; if that’s enabled,
when you do light bakes, it’ll use Virtual Texture
streaming for lightmaps too. Next are
tile size and border size, which you normally
don’t need to change, and then here you have
some additional options for compressing Virtual Textures
to save disk memory.
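For reference, those settings live in the project’s DefaultEngine.ini; a minimal sketch (the variable names below are the renderer settings I believe 4.23 uses, so double-check them against your Engine version):

    [/Script/Engine.RendererSettings]
    ; enable Virtual Texture support
    r.VirtualTextures=True
    ; optional: also use Virtual Texture streaming for baked lightmaps
    r.VirtualTexturedLightmaps=True

Once it’s enabled,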
using Virtual Texturing is as easy
as opening a Texture and just clicking this button
here to enable it. You don’t really see
any difference, but you can see here the method
is now set to Virtual Streamed instead of Regular Streamed. One interesting thing to help
visualize what’s going on: there’s a console CVar,
r.VT.Borders, and you can set that to one.
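In the console, the toggle looks like this (a debug visualization; set it back to zero to turn the tile overlay off again):

    r.VT.Borders 1
    r.VT.Borders 0

So then, using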
Virtual Texturing, you can use Virtual Textures in a Material just the same way
as you use regular Textures. Like we can create a Material
from this Texture. Open it up, you can see here
there’s a little VT that designates
that this is a Virtual Texture. So right here, this is the thing
that’s different between regular Textures and
Virtual Textures and Material. The sampler type is set
automatically to Virtual Color. So, you can see there are
these virtual sampler types that are only available
with a Virtual Texture and that’s basically it.
If it’s set to virtual color, you have a Virtual Texture
plugged in and it just works
like a regular Texture. Now,
in order to see this here, we can place this Texture
in the World. Here we go. So now these are the Virtual
Texture borders here. You can see each
of these squares represents a tile of 128
by 128 pixels. You can see right now
we have roughly 256 by 256 of Texture data loaded to show
this surface of the cube. But then as we zoom in
and we get closer, it starts streaming
in more data. So now we’re up to 1024
by 1024 or sorry, 512 by 512, one, two,
three, four — four tiles. Then you keep going, it automatically loads
in the higher resolution tiles. If we’re off screen here, so these tiles that would be
off to the left of the screen, that aren’t visible,
the system will know that those tiles aren’t needed
to render this frame so they will not be streamed. Or, if you had
some sort of occluder, like something in front
of part of this cube, those tiles as well
would not need to be loaded. That’s the true advantage
of Virtual Texturing. It only needs to load based
on what’s actually visible and you’re seeing and there’s no distance-based
metrics or the things that you
typically require with basic, regular Texture streaming. A couple more things here about
the efficiency of the system. Basically, when you have
a Virtual Texture sample, it does cost more than
a regular Texture sample. It requires
more GPU instructions. There are actually
two Texture samples that occur under the hood
for each Virtual Texture sample, and then the additional
add and multiply instructions.
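Under the hood, that lookup is shaped roughly like this, as a hedged HLSL-style sketch with illustrative names, not the Engine’s actual shader code:

    Texture2D PageTable;        // indirection: which physical tile holds each page
    Texture2D PhysicalTexture;  // tile cache of resident 128x128 tiles
    SamplerState PointSampler;
    SamplerState LinearSampler;

    float4 SampleVT(float2 UV)
    {
        // Sample 1: look up the page table entry for this UV.
        float4 Page = PageTable.SampleLevel(PointSampler, UV, 0);
        // The extra ALU work: scale and bias into the physical tile cache.
        float2 PhysicalUV = UV * Page.xy + Page.zw;
        // Sample 2: fetch the actual texel.
        return PhysicalTexture.Sample(LinearSampler, PhysicalUV);
    }

One interesting thing here — we could make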
another Virtual Texture here. So again, if we want
to make this virtual, just go and set it to Virtual
Texture Streamed; that’s it. Now, I have a Material that’s
looking up two Virtual Textures, the Diffuse Texture
and the Normal Map Texture. You can see both of these
Textures are being sampled using the same UVs. There is this thing called
Virtual Texture Stacks. A Virtual Texture Stack is
any number of Virtual Textures that you sample in your Material
using the same UVs and basically
that can help amortize the cost of Virtual Textures. You basically need one actual
Texture sample per stack, and then an additional Texture
sample for each Virtual Texture. Right now, this is actually
costing me three Texture lookups, one for the stack and then one
for this and then one for this. But for example,
if I change this to — it’d be sampled with
a different UV, now we can see
we’re using two stacks. This is one stack,
now this is its own stack, and now we’re using
two Texture samples here and then another
two Texture samples here. This is mostly transparent. You don’t really need
to worry about it, but just when you’re
planning out your Materials, try to combine as many
Virtual Textures as possible using the same UVs. Like if you’re looking up
a number of Material components like base color, metallic, specular, whatever,
using the same UV for all of them
will be the most efficient way.
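In terms of the sketch above, two Virtual Textures sharing one set of UVs would amortize like this (again hedged and illustrative):

    Texture2D PhysicalColor;   // physical cache for the color Virtual Texture
    Texture2D PhysicalNormal;  // physical cache for the normal Virtual Texture

    void SampleVTStack(float2 UV, out float4 OutColor, out float4 OutNormal)
    {
        // One page table lookup is shared by the whole stack...
        float4 Page       = PageTable.SampleLevel(PointSampler, UV, 0);
        float2 PhysicalUV = UV * Page.xy + Page.zw;
        // ...then one physical fetch per Virtual Texture in the stack:
        OutColor  = PhysicalColor.Sample(LinearSampler, PhysicalUV);
        OutNormal = PhysicalNormal.Sample(LinearSampler, PhysicalUV);
        // Three samples total for two Virtual Textures, instead of four.
    }

I think that’s pretty much it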
for the basic functionality of how Virtual Texturing works.
I can pass it on to Jeremy to talk about
Runtime Virtual Texture.>>Victor: We had one question that I thought
was pretty relevant here. “Is there no way to mass-set virtual
texturing for all Textures?” I think they’re asking if it
exists in the bulk edit option?>>Ben: Yes. I believe you should be able
to select multiple Textures. There’s also an
interesting thing — There is this Convert to Virtual
Texture Tool, so if you right click on a
Texture, there’s this Convert
to Virtual Texture and if you click this, this will convert
this particular Texture to Virtual Texture, but then it will look
for all the Materials that use that Texture
and then modify the Materials to be properly
sampled from Virtual Texture. It’s actually a recursive process, so it’ll look
for all the Materials that referenced that Texture but then it will also look
for all the Textures that are potentially set
as parameters on those Materials and then convert those
to Virtual Texture as well and then again, Materials. So, if you have
an existing project where you want to try
out Virtual Texturing, you can just select
a whole bunch of Textures — Oh, I have a Material there,
that’s why, with a whole bunch of Textures,
Convert to Virtual Texture and it will convert
all the Textures and then automatically
recursively convert all the Materials as well. The only thing that has
to touch the Material in order to modify
this sampler type to properly reference
in Virtual Texture rather than the regular Texture. I guess that’s actually
one more concern or consideration as well. If you have a Material
that’s using a Virtual Texture as a parameter, then any Material instance that
wants to override that parameter will also need to use
a Virtual Texture. You can’t mix
and match Virtual Textures and non-Virtual Textures within
a single Texture parameter.>>Victor: That’s great and that
totally works. Cool. Let’s see if we can get
Jeremy’s screen share over.>>Ben: Oh, yeah.
Let me give that up.>>Jeremy: Okay, here we are. I’m going to speak a little bit
about Runtime Virtual Texturing. That builds pretty much
on the same tech as the Streaming Virtual
Textures that we’ve just seen. But instead of streaming
in those Virtual Textures and tiles from disk, we’ll fill those in
at runtime using the GPU. There are a few ways
that you can think about that. You can think of
a Runtime Virtual Texture as a huge procedural Texture, so we don’t need
any storage for it, we just run some procedural
algorithm in the Material and fill in Virtual Texture
just in time using that and take advantage
of all the benefits that we’ve just seen
in Streaming Virtual Textures, where only the tiles
that are visible on screen that we actually want
to sample from, we pay to fill. Another way to think about it is
that a Runtime Virtual Texture is a huge render target
in the World. Again, we’re just writing to it
just in time for the areas of that render target that we want to sample from
and reuse in other Materials. The third way I think about it
is that it’s a shading cache. Maybe we have a Material
that’s very expensive to render but it doesn’t change
from frame to frame. The output from it doesn’t
change frame to frame. It’s not camera dependent
in any way. A Runtime Virtual Texture
allows us to just render tiles of that Material’s
output into a Virtual Texture which acts as a cache that
we can then read from later on. That kind of mental model of a
shading cache fits really well with something like a landscape
in a project where landscapes can be
really quite expensive to render because they tend to sample
from a lot of Texture layers, they sometimes do things
like tri-planar sampling, have a lot of interesting blends and maths going on
and the landscapes tend to cover a large part
of the screen as well, so they can be
quite expensive to render. So, the performance
optimization here, rather than
a memory optimization, is that we can render that
landscape Material to a cache, which is the Runtime
Virtual Texture and then all we have
to do frame to frame is sample from that cache. Whereas Streaming Virtual
Texturing is kind of a memory optimization
where we pay a small amount in terms of performance
for using it. This is more of
a performance optimization where we’re going to pay
for having the physical cache. As a demo of the idea,
I’m going to use the Landscape Mountains sample
that comes with Unreal and I’ll convert it to use
a Runtime Virtual Texture. I’ve already gone through
the step of modifying the project settings
as we saw Ben do, and I’ve made a duplicate of the
basic landscape Material and assigned it. We can see right now with the Shader Complexity view mode — just wait for that
to warm up a little bit — that the terrain is quite
expensive to render, it’s in red and the aim
is to reduce that cost. The first thing we’re going to
need to do is create
a Runtime Virtual Texture. We can do that
through the Materials and Textures context menu here and this gives us some settings
to fill in for it. The first one is the size
of the Virtual Texture. In 4.23, the maximum size
we can go to is 256K. That’s not really big enough for
the application we have here. We have a four-kilometer
sized World, I think, so this kind of size
of Virtual Texture gives us a textural resolution
of about one and a half centimeters,
which isn’t really sufficient.
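To check that math: 256K is 262,144 texels on a side, and four kilometers is 400,000 centimeters, so 400,000 / 262,144 works out to roughly 1.5 centimeters of world space per texel. Obviously, 4.23 was beta; we’re dogfooding internally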
and improving things and we’ve improved
some of the memory restrictions in terms of page
table size and so on. So, we can go above this size and you should see that
in later versions of the Engine. But just for this demo,
we’ll stick here. The other options, in terms
of tile size and border size and so on,
are similar to the ones that you’ve seen for Streaming
Virtual Textures already. The other interesting
option here is what we are going to store
in the Runtime Virtual Texture. We have a few options and we’ll
extend these as required, as we see more functionality
required in future versions of the Engine. But we can go
for just base color; base color and normal;
or base color, normal, roughness, and specular,
which is what we’ll choose, so we’re able to do a rough
kind of PBR render with it. Also, we enable
Texture compression, and this means that when we render
to the Runtime Virtual Texture, we’re going to compress it in BC
format to reduce memory size and to improve
sampling performance. So, we’ve created that and now
we need to also place it in the World, because we need to define
where the Virtual Texture is placed in the World
and where things render into it, and what the projection
for rendering into it is as well. To do that, we need to create a
Runtime Virtual Texture volume. I’ve just dragged
that into the World and assign a
Virtual Texture to it and now I can resize and place this wherever
I want in the World. As a quick helper
for this as well, what we really want here is to have
the Runtime Virtual Texture cover the entirety
of our landscape object. We can use this shortcut here
when we pick the landscape and copy its rotation and bounds, and then that should
transform it so that we can see now that it’s
covering the entire landscape. That’s the bounds, in orange, of the Runtime
Virtual Texture there. Now we have our Runtime Virtual
Texture in the World, we need to actually start
using it in the Material. This is the landscape Material and right now we’re doing
a whole bunch of work and then piping that output
to these Material attributes. What we’re actually going to do
is, let me just — what we’re going to do is use
this Material in two contexts. The first context is that we’re
going to use it for writing to the Runtime Virtual Texture,
writing into the cache, and then the second context
is that we’re going to sample from that Runtime
Virtual Texture, and then pipe it out to the
Material attributes as normal. So, we can imagine this Material
is actually going to be compiled twice,
once for each of those contexts. The first context is to write to
the Runtime Virtual Texture and so we have a node
which helps with that, which is the Runtime
Virtual Texture output. We can take all of
the current outputs and pipe them there instead.>>Victor: I think we
might have lost your screen share there, Jeremy. I think we’re back. We’re going back to reset up
just here in a bit because the pro-license
that we actually have didn’t apply
during our meetings. So, but we got
another seven minutes. So, let’s run through this
and then we’ll set up a new call if you have
some more to present. We can see you now,
so it’s all good.>>Jeremy: We can continue
now, and did I just lose
the normal there? I think I did. Let’s just undo.>>Victor: It’s a nice little
feature to be able to undo.>>Jeremy: It is indeed.
Thank you, whoever invented that. Now we’re piping
all of those things into the Runtime
Virtual Texture, and then the second
context we will need is to read from
the Runtime Virtual Texture. So, we have another node
for that which is the Runtime
Virtual Texture sample node. This is similar to the
other Texture sampling nodes, but it’s a special case
for Runtime Virtual Texture so that we can pull
the different attributes out from a single node rather than creating multiple
Texture sample nodes here. We need to assign our runtime
Virtual Texture to it and you can see that, just like
a normal Texture sample node, it needs to have the sampler
context that it’s going to use and this should match
what we set up on the Runtime
Virtual Texture originally. You can see the options
are the same. That would need to match and then we’ll pipe
those outputs to here. You can see there’s one thing that we can’t pipe
into the Runtime Virtual Texture because there’s no
space for it here and that’s the emissive color. We could keep that
and that’s fine. We can do any logic
we want in addition
Runtime Virtual Texture. We could do things
that are non-camera dependent in this area here, so it’s fine. But I’m actually going
to remove it anyway, just so that we can see a better
performance comparison later. So that’s good. We’re all set up
and I’ll apply that. Now if we go back
into the landscape, what we’ll see is that it’s not
appearing correctly yet, and there’s one reason for that,
and that’s that everything that renders
into the Runtime Virtual Texture needs to declare
that it is going to do that. So, all primitives that render
into the Runtime Virtual Texture have to specify
which Runtime Virtual Texture they’re going to render into. There’s an option here on most
primitives, Virtual Texture, and this is the list
of Runtime Virtual Textures (there can be more than one)
that we want to render into. So, we’re going to add to that
list, choose the first Texture, and here we go. We can now see that
we’re getting some output. It doesn’t look so great,
but that’s because there are a few issues
that we need to fix as well now. One other interesting item here
as well is that we also specify the passes that we
are going to apply. For landscape, we want to render
it into the Virtual Texture and then we want
to sample from it in the main pass as well. So, you can see that
we’ve chosen Virtual Texture and main pass here, but there are some options
where you can choose just to render into the virtual
Texture, for example, if we don’t want
to sample from it. The first thing
that looks a bit off is the Texture tiling here,
looks completely wrong. The Textures are tiled
much more densely initially and there’s a reason for that. If we go back into
the landscape Material, there’s a little bit of a clue, in that there’s a note here
which has a far blend distance. Most landscapes do a trick
a bit like this where according to the distance
from the camera, some of the components
in the Material are changed. So often we change the Texture
tiling over time or over distance or whatever, and that’s happening
in this Material. That could be problematic
for a Runtime Virtual Texture because when we render
the Runtime Virtual Texture, there is no context
of where the camera is, it’s completely
camera independent. So, distance from camera
doesn’t really make sense. But there is an alternative
that we can use and that is to use the mip level
of the Virtual Texture that we’re rendering to
at any given time. That’s not exactly a one-to-one
match with distance from the camera
because there are other things that control
which mip level we choose, but it’s a pretty good
approximation in most cases and good enough to use for something
like a landscape shader. I’ll just show
how that would work. Somewhere deep
inside this Material, we have this Material function and it’s used
in a number of contexts, and this Material function
is basically saying, “What’s the difference
between the World position and the camera position? What’s the distance
to the camera?” And this is the thing
that we need to have an alternative approach for when we’re rendering
to the Runtime Virtual Texture. There’s a node,
the View Property node, and inside that View Property
node there’s the Virtual Texture
output level. So, this outputs zero for mip zero and then, effectively,
further away from the camera, the number gets higher. Again, it’s not exact,
and it kind of depends on your screen resolution
as well in your final output. If you want to do this
really correctly, you’ll need to pipe that into
the Material too as a parameter. But there’s a rough
approximation for this. What we can do is some kind
of power-of-two mapping; it’s logarithmic, so the distance
roughly doubles with each mip level, so we can pipe
the mip level property into a power of two.>>Victor: Hey Jeremy, I’m sorry
to interrupt you. We’re going to have to just
re-setup the calls so that we can get the login
for the pro license.>>Jeremy: Okay.
I’ll stop the share.>>Victor: That is all good. This should be
a pretty quick one. I’m going to send you guys
a new link. We have a question
in the meantime that I can go ahead and ask you. They were a little curious about
the sort of overall performance costs for Virtual Textures. They were mentioning
that the instruction counts seem a little high and then specifically
if this is something worth utilizing on consoles in the
current Unreal Engine version.>>Jeremy: Right. I can take that but Ben may
have some other opinions to add. Yes, there are
a number of instructions to sample the Virtual Texture. We need to transform the UV
into the physical UV after sampling the page table. Then there’s some —
in Runtime Virtual Texturing, there’s some
decoding cost as well. There is some instruction count
cost and performance cost, and the Texture indirection
as well. So yes, but in terms of
current gen consoles, I think depending
on your use case, I think it’s appropriate for use
on current gen consoles and certainly other — There are a number of AAA games
out there running on console that do use this kind of
technique in their own engines.>>Victor: All right.
I think we’re back on the call and now I believe Jeremy, you should be able
to grab the screen share again.>>Jeremy: Alright,
can you see that?>>Victor: We do, we’re back.>>Jeremy: We should really pay
for the license, right?>>Victor: Well, we did, and now it’s actually loaded
and we’re on it.>>Jeremy: Alright, perfect. So, we do some kind of hackery,
multiply by another constant,
and this is probably
doing at this stage is, remember that this shader is potentially used in more
than one context, either for writing
to the Virtual Texture or for any other context
writing to the main pass. It’s a useful to actually
have a node that tells you
which context you’re using and allows you
to switch between them and that’s this Runtime
Virtual Texture replace node. We want this solution to happen when we’re out putting into
the Virtual Texture and this solution
in all other cases. So, we apply
that and with a bit of luck we should see the tightening
of the Texture has gone down
to something that’s appropriate. We can see that the tiling
of the Texture over here is a bit larger, which is
sort of the effect we want. Another issue
that we need to solve: here, these are some mountains
in the World, and we can see
that we have striations that have gone
a little bit wonky. They should be straight
and the reason for that is that by default when we render a landscape into
the Runtime Virtual Texture, we render it in a single quad
for performance reasons. So each landscape component will
be rendered as a single quad. The problem with that
is if you have any Material that’s using interpolated data,
height or normal, to provide some outputs, then you won’t have a high
enough quality interpolation to give a good result. If you have anything like that,
you need to allow higher MIPS or higher LODs of the
landscape to be rendered because right now
we’re clamping it to zero just to get the most performant
results out of the box. So, if we just change
that to something — six is a pretty good setting — then you’ll see that
that now fixes that issue and things look
as they should there. So we go back to here. I think that’s it. The only other thing I’d like
to fix right now before we go any further is,
I said earlier that the resolution
of this Virtual Texture isn’t really good enough
for this size World. So that’s something
we’re working on internally to provide a much
better solution for so that you can use much bigger
Virtual Textures on your World. But right now,
it’s not in this version. I’m going to just shrink down
the Virtual Texture volume so that we still have
the quality we want while we’re looking at
a few of the features that you can apply with
the Runtime Virtual Texture. To do that, I’m just
going to select the volume. Let’s go to it; I’m going to
make it quite a bit smaller, divide it by eight
or something like that. You can see that outside
of the volume the values are now
being clamped to the edges of the volume
where we’re sampling it. I’ll put it roughly over
where I think we were working. Now we should be getting
a reasonable kind of quality result. Obviously, we can use that
for the landscape shading. Let’s just check whether
the performance is improved. Now we can see that things
are looking better. It’s no longer red
so that’s a win. There are things other than
performance improvements that you can get out of this.
One of the good things is that, it doesn’t have to be
the landscape that samples
from the Virtual Texture. This can help
with settling things into — combining things
in your World. So, if you’ve got these rocks
down here that are fine, they’ve got a Material
that looks pretty good but you can see that, they don’t
really follow the World, the landscape shading.
If we look at that shader, we can in here also add
a Virtual Texture sample node. We can, from here, sample
the Runtime Virtual Texture, push that out into Material
Attributes node. We’re done there.>>Victor: I think your bandwidth
is a little slow at the moment. We’re actually seeing
a still frame again, so let’s just see
if we can let it catch up.>>Jeremy: Okay.
>>Victor: There we go. I think we’re back.>>Jeremy: Okay, great. Montreal is a long way away. If we apply that, we can see
now that these rocks are actually sampling
from the landscape. They sync up perfectly
with the landscape. One thing you’ll get
if you do that is that you’re going to get
some stretching on these because Virtual Texture
is a projection so it’s sampling from it
in a top down projection way so on these sharp edges,
you’re going to have a problem. A good technical artist would have a number
of solutions for that, I’m not one of those,
but something you can do is just blend according
to the normal. I’ll just show that.
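The gist, as a hedged material-code sketch with illustrative names: since the Virtual Texture is a top-down projection, only trust it where the surface faces up.

    float3 BlendWithLandscape(float3 RockColor, float3 LandscapeVTColor,
                              float3 WorldNormal)
    {
        // Z of the world normal is 1 on flat ground, 0 on vertical cliffs.
        float UpFacing = saturate(WorldNormal.z);
        // Steep surfaces fall back to the rock’s own Material.
        return lerp(RockColor, LandscapeVTColor, UpFacing);
    }

Let me blend between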
these two things. Just go into a
normal set and apply that. That doesn’t appear
to be working.>>Victor: That happens all the time.
>>Jeremy: Oh, here we go. That’s the reason. You can see now those kind of
harsh seams have gone and yet stuff still
settles into the scene nicely. There’s probably 101 uses
for that kind of thing. You could use it for grass
or whatever so that the grass matches the terrain color or — there’s a bunch of things
you could do though. One other thing I just wanted to
show you as well is for roads. So, roads can render into
the Runtime Virtual Texture. We can have things that render
into the Runtime Virtual Texture and then get picked up by anything
that’s sampling from it. That’s pretty easy
to set up as well. If we look at this road,
it has this Material I think and we could create
a Virtual Texture Output node and push all this stuff to it.
But there’s another option which is simply to set
Material domain to Virtual Texture
and apply that. It will continue to work
as normal in the main pass so we don’t see
any change there yet, but now it’s available for us
to use in a Virtual Texture rendering as well. If we go to edit this road
and we choose spline tool, select all of the segments, we can see that the road
segments have a render-to-Virtual-Texture option. You can render to a
Virtual Texture here and you’re rendering
into the Virtual Texture you probably don’t
want to cache; I’ll just turn that off. What we can see here now
is the default for a road: we render
into the Virtual Texture or into the main pass. So, if Virtual Texture
is enabled, it’s going to render
into the Virtual Texture and then not render
into the main pass. You can also set it to render
into the main pass too if you want to resample
the Virtual Texture in the same way
that we do with the landscape. But this is
a good enough option. Now we can see that this road
is actually being rendered as part of the landscape
now in the main pass. We’ve rendered the road
into the Virtual Texture and then it’s being pulled back
for us in the landscape shading. And because we’re blending like that into
the Runtime Virtual Texture, that gives us a few more options
for what we do on the edges here because the road previously
was being masked but we can go back
to that Material and now we’re allowed
to use blending modes. So, if we pull out the alpha
from the road and pipe it into the opacity, you can now see that
we’ve got much better blending between the road
and the landscape, which is really kind of cool. And we can move that around
in the landscape just to prove that it is being actually
sampled from the landscape now. You can see that
it’s actually now kind of a projected thing
onto the landscape. This whole thing actually
works really well with the new open World
landscape editing tools that we have where roads
can now morph — sculpt the landscape as well. Things start to work really well
as these systems combine. So yeah, that’s that. There’s one other thing
I just wanted to show you. There’s another system
that can render into the Runtime Virtual Texture,
that’s the foliage system. This is kind of done because I don’t actually have
any cool decals on hand to actually render
into the landscape. You could imagine stuff
like cracks on the road or similar things to that. Instead, I’ve just got this mesh
which is some rock gravel. If I open that and
find its Texture and I turn that into
a Virtual Texture — something that’s capable of
rendering to the Virtual Texture. I go back and again,
the same options down here somewhere
for Runtime Virtual Texture where I can select it to render
into the Virtual Texture here. Now I can —
oh, that’s the wrong one. Well, that’s a weird thing. It seems to be putting
the wrong thing down. Nope, I don’t know
why it’s doing that. Oh, hang on.>>Victor: I think if you
uncheck the check box.>>Jeremy: You’re
absolutely right. Sorry. Those aren’t appearing. Oh, I know what it is. There’s another option
down here somewhere, which is the priority. So, because we are
blending things into the final Runtime Virtual Texture,
they need a sort order. The landscape is
at a sort order of minus one. Everything else
defaults to zero. So, the road is already at zero. If you want this to appear
on the road, we need to give it
something suitable. It’s still not appearing. I’m going to give it one more go
before I give up.>>Victor: Don’t worry. I spent
three minutes eventually realizing that I had exposed a Class
and changed it on the instances
of the Actor in the level and that was all done live
on the livestream, so they’ve all seen worse.>>Jeremy: Nice. I didn’t apply.
There we go. Okay. You can see I’ve scattered
these things everywhere. They’re not actually Mesh rocks. They’re crazy, strange
projections of Mesh rocks down into the Virtual Texture. They look horrible, but you
could imagine using this system to add other really cool decal
features into your landscape. So yeah, that’s it.
We’re continuing to work on it and improve it, both in terms
of performance and feature set beyond the 4.23 beta. We’re dogfooding
pretty intensively here. So hopefully, this will provide
some good solutions for the licensees.>>Victor: Yeah, it was really
cool to just see you move the road across the landscape
and see it blend seamlessly using that.
That’s really neat. We do have a couple questions. Oh, I had one actually
I was curious about, so you were using a rock Mesh
as the source Asset when you were painting, right? But then due to the setting
that you changed, that’s actually what causes
the projection to happen rather than actually
rendering the rock Mesh?>>Jeremy: Yeah, that’s right.
What’s actually happening there is that when we render
the Virtual Texture, we’re rendering the rocks
into the Virtual Texture, but then of course that just
flattens everything out, because it’s just being rendered
into the render target and then we sample back
from the render target. So yeah, we were rendering the real rocks
into the Virtual Texture but when we sample out, we just
have the flattened version. One of the cool things
you can do is you can match up like a bit of geometry and some sort of blended
like Texture representation of that geometry and you
render the blended Texture representation of that geometry that you want into
the Virtual Texture and then you allow
the geometry itself to just sample back
from the Virtual Texture. That’s one of the ways that
you can work with this stuff and that allows you
to kind of LOD out the geometry
over a distance, but it still appears
to be there. So, you can use it to support
some LODing techniques too.>>Victor: That’s really neat. Let’s see, I’ve got
a couple of questions. One of the questions was
how Runtime Virtual Texture is supposed to be used
on a larger landscape where the resolution of
a single Runtime Virtual Texture is not sufficient?>>Jeremy: It’s a great question. This is something
that we’re working on. You could use multiple
Runtime Virtual Textures, it’s true, that’s one approach. Although I think that’s not
going to extend very well because in the end, there’s a limitation
to the number of those objects we’ll be able to
support without — and still be able
to uniquely identify them for when we’re getting feedback about which Virtual
Textures need updates. What I’d like to do
is something like this:
there are techniques for this,
like adaptive virtual texturing, and there are some papers
the Far Cry games, where you can
effectively increase the size of your Virtual
Texture resolution without hitting the issues
you would come to if you did it naively, which are the page table
blows up to an enormous size so you end up paying a huge cost
in terms of memory for your page table and updating your page table
becomes expensive as well. There are ways to
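To put rough numbers on why (illustrative math, not figures from the stream): with 128-pixel tiles you need one page table entry per tile, so a Virtual Texture 1,048,576 texels on a side needs 1,048,576 / 128 = 8,192 entries per side, over 67 million entries in total, all of which has to sit in memory and be kept up to date. There are ways to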
break that problem and we don’t have any dates that we can give
for that being in the Engine, but it’s something
that we are looking to put in the Engine
at some point.>>Victor: This is a question
I don’t entirely understand, but I think you guys might. “Currently Runtime
Virtual Texture seems to be lacking the ability to have custom content
layout in favor of just three limited presets.
Any plans to address that?”>>Jeremy: Not right away, no. A couple of content layouts
have been added internally; one cool one
is a World Height layout, which is just a 16-bit
grayscale value that we can interpret as height, and you can use that
for blending, like, how far is my object away from
the landscape, and therefore affect the blending
according to that. Obviously,
there are other great ideas that we could also implement. It’d be great to have
some flexible approach, which allows people
to extend it, but right now
that’s not where we are. We’d have to see how important
that is for people.>>Victor: Alright,
another sort of a little bit more generic question. “Can Virtual Texturing be
helpful in bringing expensive UDIM Textured cinematic
quality Assets into Unreal?”>>Ben: Yeah, potentially. I didn’t mention
that previously, but we should natively support
importing UDIM Textures as Virtual Textures. Basically, if you have
a whole series of UDIM Textures with the proper
naming convention, just importing the first (.1001) UDIM Texture should detect
all of the other UDIMs in the same directory and bring them all in as
a single Virtual Texture Asset. Then if you apply that Texture
Asset in a Material, it should automatically scale
the UVs correctly, such that your native UDIM UVs
baked into the Mesh should function
correctly in Unreal.
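As an example of the convention (hypothetical file names; the UDIM pattern itself is standard, with tile 1001 covering UV 0 to 1, 1002 the next tile along U, and 1011 starting the next row in V):

    Character_BaseColor.1001.png
    Character_BaseColor.1002.png
    Character_BaseColor.1011.png

Importing Character_BaseColor.1001.png should pick up the other two and build one Virtual Texture Asset from all three. So yeah, that’s definitely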
one of the big hopes for Virtual Texturing as it lets you use
much higher resolution Textures because it efficiently
just streams in the data that’s actually
visible on screen and UDIMs are sort of —
that’s the current industry standard for authoring
large Texture datasets. So, yeah.>>Victor: That’s neat. Hopefully we’ll be able to
see some more cinematic stuff. Check out the Unreal Film Jam, if that’s what
you’re interested in, that we’re currently running. I think that was it
in terms of questions. I’d like to thank both Ben
and Jeremy for coming on, taking the time out of your day
to prepare your presentations and spending this hour
with me here. Before we leave, I would
just like to mention that we have started
doing transcripts. We’ve actually done captions
for a while on our livestream videos but what we started doing
is that we also take it and upload it as a PDF document
with the timestamps. If you’re looking for a specific
part of the livestream,
something that we touched on or
talked about, but you don’t want to sit
that transcript PDF document, control F and search
for sort of the keywords that you’re looking for. That will actually bring up
the proper timestamp and you can go there. Another way that
we discovered last week, actually the person
who is helping us out with all the captions discovered
that you can turn on live feedback of the captions
next to the YouTube window right above the live stream chat
that we also have. What’s nice about that is that
the captions won’t cover up portions of the editor
if we’re doing editor work and you will also see a sort of
a real-time little highlight of which line is covered and that can also allow you
to click back, “Oh no,
I want to hear that again. What was up at that time stamp?” You can go there and click
on it. It’s quite useful. I use it myself
to find portions of the stream. If you’d like to let us know
how we did today, what kind of topics we might
be covering in the future, I will go ahead and post
our little survey here today because Amanda is out.
So, this is all on me today. We’ve got it on Twitch
and there it is on YouTube. Please go ahead
and let us know how we did. Everyone who enters their email
address into the form will be part of
a little sweepstakes where we send out
an Unreal Engine T-shirt. Moving forward, we do Meetups
all around the world. If you’re interested
in joining one, go ahead and go to
UnrealEngine.com/user-groups and see if there’s any near you. If you don’t see one
that’s near you (considering, I mean, that’s relative to
how much you want to drive) and you are curious about what it might mean
to organize or host one, send us an email
to [email protected] and we will give you all the information
on what that entails. As always, we do our three
community spotlights before every stream. We look for them
everywhere. The forums are probably the best spot, but sometimes I scroll
through Discord, Reddit, they’re all good places.
You can also just send us an email
to [email protected] if you’d like to show us
your work. We are getting a little short
on countdown videos. We’ve definitely used
the one we had today before. That is 30 minutes
of development, fast-forwarded down to five; send that alongside
your logo and we might go ahead
and spotlight your video as our countdown and if you’re streaming
on Twitch, make sure that you use
the Unreal Engine category so that we can tune in,
see what you’re working on. It’s usually a lot of fun,
helping each other out. Then, as always, follow us
on social media. I would like to give a big
thanks to Ben and Jeremy for coming here today and with that said, next week
we’re going to be covering Multi-User Editing
and I did a fun little photo shoot with the guys
at the stage here. I won’t be here;
Amanda will take the host seat and I will be out on Oculus
Connect actually. So, if you’re at Oculus
Connect please shoot me a message
and let’s catch up. Without further ado,
Ben, Jeremy, thank you so much and we will see you
all next week. Bye everyone.>>Ben: Cool, thanks.
>>Jeremy: Thanks. Bye.
