replaced some tabs, added copyable spaces to lstlistings

Wenzel Jakob 2014-02-06 15:11:30 +01:00
parent ba2a6dcaf7
commit 093050f755
17 changed files with 591 additions and 555 deletions

View File

@ -101,40 +101,40 @@ $\texttt{\$}$ mitsuba -c machine:1234 path-to/my-scene.xml
When no port is explicitly specified, Mitsuba uses a default value of 7554.
\item \textbf{SSH}:
This approach works as follows: The renderer creates an SSH connection
to the remote side, where it launches a Mitsuba worker instance.
All subsequent communication then passes through the encrypted link.
This is completely secure but slower due to the encryption overhead.
If you are rendering a complex scene, there is a good chance that it
won't matter much, since most time is spent doing computations rather than
communicating.
Such an SSH link can be created simply by using a slightly different syntax:
\begin{shell}
$\texttt{\$}$ mitsuba -c username@machine path-to/my-scene.xml
\end{shell}
The above line assumes that the remote home directory contains
a Mitsuba source directory named \code{mitsuba},
which contains the compiled Mitsuba binaries.
If that is not the case, you need to provide the path to such a directory manually, e.g.:
\begin{shell}
$\texttt{\$}$ mitsuba -c username@machine:/opt/mitsuba path-to/my-scene.xml
\end{shell}
For the SSH connection approach to work, you \emph{must} enable passwordless
authentication.
Try opening a terminal window and running the command \code{ssh username@machine}
(replace with the details of your remote connection).
If you are asked for a password, something is not set up correctly --- please see
\url{http://www.debian-administration.org/articles/152} for instructions.
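In practice, setting this up from a Linux/OSX client boils down to generating a key pair and
installing the public key on the remote machine. A minimal sketch, assuming OpenSSH with the
\code{ssh-copy-id} helper available:
\begin{shell}
$\texttt{\$}$ ssh-keygen -t rsa
$\texttt{\$}$ ssh-copy-id username@machine
\end{shell}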
On Windows, the situation is a bit more difficult since there is no suitable SSH client by
default. To get SSH connections to work, Mitsuba requires \code{plink.exe} (from PuTTY) to
be on the path. For passwordless authentication with a Linux/OSX-based
server, convert your private key to PuTTY's format using \code{puttygen.exe}.
Afterwards, start \code{pageant.exe} to load and authenticate the key. All
of these binaries are available from the PuTTY website.
It is possible to mix the two approaches to access some machines directly and others
over SSH.
\end{itemize}
When doing many network-based renders over the command line, it can become tedious to
specify the connections every time. They can alternatively be loaded from a text file
@ -155,7 +155,7 @@ For instance, you can render a scene several times with different reflectance va
on a certain material by changing its description to something like
\begin{xml}
<bsdf type="diffuse">
    <spectrum name="reflectance" value="$\texttt{\$}$reflectance"/>
</bsdf>
\end{xml}
and running Mitsuba as follows:
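The invocation itself lies beyond this hunk; presumably it uses Mitsuba's \code{-D} flag for
defining parameter values on the command line, along these lines (the value \code{0.5} is an
arbitrary example):
\begin{shell}
$\texttt{\$}$ mitsuba -Dreflectance=0.5 path-to/my-scene.xml
\end{shell}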

View File

@ -322,10 +322,10 @@ configuration file from the \texttt{build} directory.
\subsection{Building on Mac OS X}
\vspace{-5mm}
\remarks{
\item Unfortunately, OpenMP is not available when compiling
using the regular \code{clang} toolchain (it is available when using Intel XE Composer). This will cause the following parts of Mitsuba
to run single-threaded: bitmap resampling (i.e. MIP map generation), blue noise point generation in the \pluginref{dipole}
plugin, as well as the \pluginref{ppm} and \pluginref{sppm} plugins.
}
Compiling Mitsuba's dependencies on Mac OS is a laborious process; for convenience, there
is a repository that provides them in precompiled form. To use this repository, clone it

View File

@ -10,15 +10,15 @@ to become part of the main codebase.
Mitsuba is split into four basic support libraries:
\begin{itemize}
\item The core library (\code{libcore}) implements basic functionality such as
cross-platform file and bitmap I/O, data structures, scheduling, as well as logging and plugin management.
\item The rendering library (\code{librender}) contains abstractions
needed to load and represent scenes containing light sources, shapes, materials, and participating media.
\item The hardware acceleration library (\code{libhw})
implements a cross-platform display library, an object-oriented OpenGL
wrapper, as well as support for rendering interactive previews of scenes.
\item Finally, the bidirectional library (\code{libbidir})
contains a support layer that is used to implement bidirectional rendering algorithms such as
Bidirectional Path Tracing and Metropolis Light Transport.
\end{itemize}
A detailed reference of these APIs is available at
\url{http://www.mitsuba-renderer.org/api}. The next sections
@ -33,18 +33,18 @@ this way, otherwise the source code layout will look garbled.
same line to make the best use of vertical space, i.e.
\begin{cpp}
if (x > y) {
    x = y;
}
\end{cpp}
\paragraph{Placement of spaces:} Placement of spaces follows K\&R, e.g.
\begin{cpp}
if (x == y) {
    ..
} else if (x > y) {
    ..
} else {
    ..
}
\end{cpp}
rather than things like this
@ -61,12 +61,12 @@ have the prefix \code{m\_}. Here is an example:
\begin{cpp}
class MyClass {
public:
    MyClass(int value) : m_value(value) { }

    inline void setValue(int value) { m_value = value; }
    inline int getValue() const { return m_value; }
private:
    int m_value;
};
\end{cpp}
@ -74,9 +74,9 @@ private:
start with a capital \textbf{E}, e.g.
\begin{cpp}
enum ETristate {
    ENo = 0,
    EYes,
    EMaybe
};
\end{cpp}
\paragraph{Constant methods and parameters:} Declare member functions and
@ -102,8 +102,8 @@ counting. This is done using the \code{ref<>} template, e.g.
\begin{cpp}
if (..) {
    ref<MyClass> instance = new MyClass();
    instance->doSomething();
} // reference expires, instance will be deallocated
\end{cpp}

View File

@ -12,9 +12,9 @@ something like this:
\begin{xml}
<?xml version="1.0" encoding="utf-8"?>
<scene version=$\MtsVer$>
<shape type="obj">
<string name="filename" value="dragon.obj"/>
</shape>
<shape type="obj">
<string name="filename" value="dragon.obj"/>
</shape>
</scene>
\end{xml}
The scene version attribute denotes the release of Mitsuba that was used to
@ -35,9 +35,9 @@ Similarly, you could write
\begin{xml}
<?xml version="1.0" encoding="utf-8"?>
<scene version=$\MtsVer$>
<shape type="sphere">
<float name="radius" value="10"/>
</shape>
<shape type="sphere">
<float name="radius" value="10"/>
</shape>
</scene>
\end{xml}
This loads a different plugin (\code{sphere}) which is still a \emph{Shape}, but instead represents
@ -51,55 +51,55 @@ and one or more emitters. Here is a more complex example:
<?xml version="1.0" encoding="utf-8"?>
<scene version=$\MtsVer$>
<integrator type="path">
<!-- Path trace with a max. path length of 8 -->
<integer name="maxDepth" value="8"/>
</integrator>
<integrator type="path">
<!-- Path trace with a max. path length of 8 -->
<integer name="maxDepth" value="8"/>
</integrator>
<!-- Instantiate a perspective camera with 45 degrees field of view -->
<sensor type="perspective">
<!-- Rotate the camera around the Y axis by 180 degrees -->
<transform name="toWorld">
<rotate y="1" angle="180"/>
</transform>
<float name="fov" value="45"/>
<!-- Instantiate a perspective camera with 45 degrees field of view -->
<sensor type="perspective">
<!-- Rotate the camera around the Y axis by 180 degrees -->
<transform name="toWorld">
<rotate y="1" angle="180"/>
</transform>
<float name="fov" value="45"/>
<!-- Render with 32 samples per pixel using a basic
<!-- Render with 32 samples per pixel using a basic
independent sampling strategy -->
<sampler type="independent">
<integer name="sampleCount" value="32"/>
</sampler>
<sampler type="independent">
<integer name="sampleCount" value="32"/>
</sampler>
<!-- Generate an EXR image at HD resolution -->
<film type="hdrfilm">
<integer name="width" value="1920"/>
<integer name="height" value="1080"/>
</film>
</sensor>
<!-- Generate an EXR image at HD resolution -->
<film type="hdrfilm">
<integer name="width" value="1920"/>
<integer name="height" value="1080"/>
</film>
</sensor>
<!-- Add a dragon mesh made of rough glass (stored as OBJ file) -->
<shape type="obj">
<string name="filename" value="dragon.obj"/>
<!-- Add a dragon mesh made of rough glass (stored as OBJ file) -->
<shape type="obj">
<string name="filename" value="dragon.obj"/>
<bsdf type="roughdielectric">
<!-- Tweak the roughness parameter of the material -->
<float name="alpha" value="0.01"/>
</bsdf>
</shape>
<bsdf type="roughdielectric">
<!-- Tweak the roughness parameter of the material -->
<float name="alpha" value="0.01"/>
</bsdf>
</shape>
<!-- Add another mesh -- this time, stored using Mitsuba's own
(compact) binary representation -->
<shape type="serialized">
<string name="filename" value="lightsource.serialized"/>
<transform name="toWorld">
<translate x="5" y="-3" z="1"/>
</transform>
<!-- Add another mesh -- this time, stored using Mitsuba's own
(compact) binary representation -->
<shape type="serialized">
<string name="filename" value="lightsource.serialized"/>
<transform name="toWorld">
<translate x="5" y="-3" z="1"/>
</transform>
<!-- This mesh is an area emitter -->
<emitter type="area">
<rgb name="radiance" value="100,400,100"/>
</emitter>
</shape>
<!-- This mesh is an area emitter -->
<emitter type="area">
<rgb name="radiance" value="100,400,100"/>
</emitter>
</shape>
</scene>
\end{xml}
This example introduces several new object types (\code{integrator, sensor, bsdf, sampler, film}, and \code{emitter})
@ -211,10 +211,10 @@ are allowed. Here is an example:
\end{xml}
\renderings{
    \fbox{\includegraphics[width=10cm]{images/blackbody}}
    \hfill\,
    \caption{\label{fig:blackbody}A few simulated
        black body emitters over a range of temperature values}
}
\label{sec:blackbody}
Finally, it is also possible to specify the spectral distribution of a black body emitter (\figref{blackbody}),
@ -252,8 +252,8 @@ with the identity, one can build up a transformation using a sequence of command
does a translation followed by a rotation might be written like this:
\begin{xml}
<transform name="trafoProperty">
    <translate x="-1" y="3" z="4"/>
    <rotate y="1" angle="45"/>
</transform>
\end{xml}
Mathematically, each incremental transformation in the sequence is left-multiplied onto the current one. The following
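As a concrete illustration, the \code{trafoProperty} sequence above accumulates to
$\mathbf{M}=\mathbf{R}_y(45^\circ)\cdot\mathbf{T}_{(-1,3,4)}$, so a point $\mathbf{p}$ is transformed as
\begin{equation*}
\mathbf{M}\,\mathbf{p} = \mathbf{R}_y(45^\circ)\,\bigl(\mathbf{T}_{(-1,3,4)}\,\mathbf{p}\bigr),
\end{equation*}
i.e. the translation takes effect first, followed by the rotation.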
@ -323,22 +323,22 @@ to declare it over and over again, which wastes memory, you can make use of refe
of how this works:
\begin{xml}
<scene version=$\MtsVer$>
<texture type="bitmap" id="myImage">
<string name="filename" value="textures/myImage.jpg"/>
</texture>
<texture type="bitmap" id="myImage">
<string name="filename" value="textures/myImage.jpg"/>
</texture>
<bsdf type="diffuse" id="myMaterial">
<!-- Reference the texture named myImage and pass it
to the BRDF as the reflectance parameter -->
<ref name="reflectance" id="myImage"/>
</bsdf>
<bsdf type="diffuse" id="myMaterial">
<!-- Reference the texture named myImage and pass it
to the BRDF as the reflectance parameter -->
<ref name="reflectance" id="myImage"/>
</bsdf>
<shape type="obj">
<string name="filename" value="meshes/myShape.obj"/>
<shape type="obj">
<string name="filename" value="meshes/myShape.obj"/>
<!-- Reference the material named myMaterial -->
<ref id="myMaterial"/>
</shape>
<!-- Reference the material named myMaterial -->
<ref id="myMaterial"/>
</shape>
</scene>
\end{xml}
By providing a unique \texttt{id} attribute in the

View File

@ -40,7 +40,7 @@ MTS_NAMESPACE_BEGIN
class MyIntegrator : public SamplingIntegrator {
public:
    MTS_DECLARE_CLASS()
};
MTS_IMPLEMENT_CLASS_S(MyIntegrator, false, SamplingIntegrator)
@ -87,7 +87,7 @@ public:
}
private:
    Spectrum m_color;
\end{cpp}
This code fragment sets up a default color (a light shade of green), which
@ -96,7 +96,7 @@ the integrator from an XML document like this
\begin{xml}
<integrator type="myIntegrator">
    <spectrum name="color" value="1.0"/>
</integrator>
\end{xml}
in which case white would take preference.
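For context, the default would typically be established in the plugin's
\code{Properties}-based constructor. A sketch (the specific RGB values and the use of
\code{fromLinearRGB} are assumptions, not shown in this diff):
\begin{cpp}
MyIntegrator(const Properties &props) : SamplingIntegrator(props) {
    Spectrum defaultColor;
    defaultColor.fromLinearRGB(0.2f, 0.5f, 0.2f); // a light shade of green
    /* Fetch the 'color' parameter, falling back to the default above */
    m_color = props.getSpectrum("color", defaultColor);
}
\end{cpp}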
@ -189,11 +189,11 @@ substituted based the compilation flags. This variable constitutes local
state, thus it must not be forgotten in the serialization- and unserialization routines:
append
\begin{cpp}
m_maxDist = stream->readFloat();
\end{cpp}
and
\begin{cpp}
stream->writeFloat(m_maxDist);
\end{cpp}
to the unserialization constructor and the \code{serialize} method, respectively.
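In context, the two statements sit at the end of these methods, roughly as follows (a sketch;
the surrounding constructor and method bodies are assumed, not part of the diff):
\begin{cpp}
MyIntegrator(Stream *stream, InstanceManager *manager)
    : SamplingIntegrator(stream, manager) {
    m_maxDist = stream->readFloat();
}

void serialize(Stream *stream, InstanceManager *manager) const {
    SamplingIntegrator::serialize(stream, manager);
    stream->writeFloat(m_maxDist);
}
\end{cpp}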
@ -202,24 +202,24 @@ distance to all corners of the bounding box, which encloses the scene.
To avoid having to do this every time \code{Li()} is called,
we can override the \code{preprocess} function:
\begin{cpp}
/// Preprocess function -- called on the initiating machine
bool preprocess(const Scene *scene, RenderQueue *queue,
        const RenderJob *job, int sceneResID, int cameraResID,
        int samplerResID) {
    SamplingIntegrator::preprocess(scene, queue, job, sceneResID,
        cameraResID, samplerResID);

    const AABB &sceneAABB = scene->getAABB();

    /* Find the camera position at t=0 seconds */
    Point cameraPosition = scene->getSensor()->getWorldTransform()
        ->eval(0).transformAffine(Point(0.0f));

    m_maxDist = - std::numeric_limits<Float>::infinity();

    for (int i=0; i<8; ++i)
        m_maxDist = std::max(m_maxDist,
            (cameraPosition - sceneAABB.getCorner(i)).length());

    return true;
}
\end{cpp}
The bottom of this function should be relatively self-explanatory. The
numerous arguments at the top are related to the parallelization layer, which will be
@ -238,11 +238,11 @@ other nodes before the rendering begins.
Now, replace the body of the \code{Li} method with
\begin{cpp}
if (rRec.rayIntersect(r)) {
    Float distance = rRec.its.t;
    return Spectrum(1.0f - distance/m_maxDist) * m_color;
}
return Spectrum(0.0f);
\end{cpp}
and the distance renderer is done!
\begin{center}
@ -251,11 +251,11 @@ and the distance renderer is done!
There are a few more noteworthy details: first of all, the ``usual'' way
to intersect a ray against the scene actually works like this:
\begin{cpp}
Intersection its;
Ray ray = ...;
if (scene->rayIntersect(ray, its)) {
    /* Do something with the intersection stored in 'its' */
}
\end{cpp}
As you can see, we did something slightly different in the distance
renderer fragment above (we called \code{RadianceQueryRecord::rayIntersect()}
@ -269,11 +269,11 @@ into a scene XML file:
\begin{xml}
<!-- Adaptively integrate using the nested technique -->
<integrator type="adaptive">
    <!-- Irradiance caching + final gathering with the nested technique -->
    <integrator type="irrcache">
        <!-- Simple direct illumination technique -->
        <integrator type="direct">
        </integrator>
    </integrator>
</integrator>
\end{xml}
To support this kind of complex interaction, some information needs to be passed between the
@ -294,16 +294,16 @@ as possible. Your overall code might for example be structured like this:
\begin{cpp}
Spectrum Li(const RayDifferential &r, RadianceQueryRecord &rRec) const {
    Spectrum result;

    if (rRec.type & RadianceQueryRecord::EEmittedRadiance) {
        // Emitted surface radiance contribution was requested
        result += ...;
    }

    if (rRec.type & RadianceQueryRecord::EDirectRadiance) {
        // Direct illumination contribution was requested
        result += ...;
    }

    ...

    return result;
}
\end{cpp}

View File

@ -91,8 +91,8 @@ Mitsuba is free software and can be redistributed and modified under the terms o
Public License (Version 3) as provided by the Free Software Foundation.
\remarks{
\item Being a ``viral'' license, the GPL automatically applies to all
derivative work. Amongst other things, this means that without express
permission, Mitsuba's source code is \emph{off-limits} to companies that
develop rendering software not distributed under a compatible license.
}

View File

@ -61,19 +61,19 @@
\pagestyle{scrheadings}
\usepackage[
    bookmarks,
    bookmarksnumbered,
    colorlinks,
    plainpages=false,
    pdfpagelabels,
    hypertexnames=false,
    linkcolor=myblue,
    urlcolor=myblue,
    citecolor=myblue,
    pdftitle={Mitsuba \MitsubaVersion\, Documentation},
    pdfauthor={Wenzel Jakob},
    pdfstartview=FitH
]{hyperref}
\definecolor{myblue}{rgb}{0,.1,.6}
@ -85,40 +85,47 @@
\definecolor{remark}{rgb}{1.0, 0.9, 0.9}
\definecolor{remarkframe}{rgb}{1.0, 0.7, 0.7}
% requires the latest version of package accsupp
\usepackage[space=true]{accsupp}
\newcommand{\copyablespace}{\BeginAccSupp{method=hex,unicode,ActualText=00A0}\ \EndAccSupp{}}
% Listings settings
\lstset{
    basicstyle = \small\ttfamily\raggedright,
    commentstyle = \color{lstcomment}\itshape,
    stringstyle = \color{lstattrib},
    mathescape = true,
    frame = lrtb,
    backgroundcolor = \color{lstshade},
    rulecolor = \color{lstframe},
    tabsize = 4,
    columns = fullflexible,
    keepspaces,
    belowskip = \smallskipamount,
    framerule = .7pt,
    breaklines = true,
    showstringspaces = false,
    keywordstyle = \bfseries,
    captionpos = b,
    upquote = true,
    literate={*}{{\char42}}1
        {-}{{\char45}}1
        {\ }{{\copyablespace}}1
}
\lstdefinelanguage{xml} {
    sensitive=true,
    morecomment=[s][\color{lstcomment}\itshape]{<!--}{-->},
    morecomment=[s][\color{lstcomment}]{<?}{?>},
    string=[b]", stringstyle=\color{lstattrib},
    keywords= [1] {
        shape,bsdf,scene,texture,phase,integer,float,
        string,transform,ref,rgb,srgb,spectrum,blackbody,
        medium,film,sampler,integrator,emitter,sensor,
        translate,rotate,scale,lookat,point,vector,matrix,
        include,fscat,volume,alias,rfilter,boolean,
        subsurface,animation
    },
}
@ -133,25 +140,25 @@
\setlength{\intextsep}{3pt}
\lstnewenvironment{shell}[1][]{\lstset{#1}}
    {}
\lstnewenvironment{cpp}[1][]{\lstset{language=c++, #1}}
    {}
\lstnewenvironment{python}[1][]{\lstset{language=Python, #1}}
    {}
\lstnewenvironment{xml}[1][]{\lstset{language=xml, #1}}
    {}
\lstnewenvironment{console}[1][]{\lstset{basicstyle=\footnotesize\ttfamily, float, #1}}
    {}
% ----- 8< ----- 8< ------
\title{
    \vspace{3cm}
    \includegraphics[width=4cm]{images/logo_plain.pdf}\\\vspace{1.5cm}
    \Huge
    Mitsuba Documentation\\\vspace{4mm}
    \LARGE Version \MitsubaVersion
    \vspace{5mm}
}
\author{Wenzel Jakob}
\date{\today}

View File

@ -36,12 +36,12 @@ MTS_NAMESPACE_BEGIN
class ROT13Encoder : public Utility {
public:
    int run(int argc, char **argv) {
        cout << "Hello world!" << endl;
        return 0;
    }

    MTS_DECLARE_UTILITY()
};
MTS_EXPORT_UTILITY(ROT13Encoder, "Perform a ROT13 encryption of a string")
@ -58,8 +58,8 @@ $\texttt{\$}$ mtsutil
..
The following utilities are available:
  addimages       Generate linear combinations of EXR images
  rot13           Perform a ROT13 encryption of a string
\end{shell}
It can be executed as follows:
\begin{shell}
@ -82,22 +82,22 @@ For reference, here are the interfaces of \code{WorkUnit} and \code{WorkResult}:
*/
class MTS_EXPORT_CORE WorkUnit : public Object {
public:
    /// Copy the content of another work unit of the same type
    virtual void set(const WorkUnit *workUnit) = 0;

    /// Fill the work unit with content acquired from a binary data stream
    virtual void load(Stream *stream) = 0;

    /// Serialize a work unit to a binary data stream
    virtual void save(Stream *stream) const = 0;

    /// Return a string representation
    virtual std::string toString() const = 0;

    MTS_DECLARE_CLASS()
protected:
    /// Virtual destructor
    virtual ~WorkUnit() { }
};
/**
* Abstract work result. Represents the information that encodes
@ -105,60 +105,60 @@ protected:
*/
class MTS_EXPORT_CORE WorkResult : public Object {
public:
    /// Fill the work result with content acquired from a binary data stream
    virtual void load(Stream *stream) = 0;

    /// Serialize a work result to a binary data stream
    virtual void save(Stream *stream) const = 0;

    /// Return a string representation
    virtual std::string toString() const = 0;

    MTS_DECLARE_CLASS()
protected:
    /// Virtual destructor
    virtual ~WorkResult() { }
};
\end{cpp}
In our case, the \code{WorkUnit} implementation then looks like this:
\begin{cpp}
class ROT13WorkUnit : public WorkUnit {
public:
    void set(const WorkUnit *workUnit) {
        const ROT13WorkUnit *wu =
            static_cast<const ROT13WorkUnit *>(workUnit);
        m_char = wu->m_char;
        m_pos = wu->m_pos;
    }

    void load(Stream *stream) {
        m_char = stream->readChar();
        m_pos = stream->readInt();
    }

    void save(Stream *stream) const {
        stream->writeChar(m_char);
        stream->writeInt(m_pos);
    }

    std::string toString() const {
        std::ostringstream oss;
        oss << "ROT13WorkUnit[" << endl
            << "  char = '" << m_char << "'," << endl
            << "  pos = " << m_pos << endl
            << "]";
        return oss.str();
    }

    inline char getChar() const { return m_char; }
    inline void setChar(char value) { m_char = value; }
    inline int getPos() const { return m_pos; }
    inline void setPos(int value) { m_pos = value; }

    MTS_DECLARE_CLASS()
private:
    char m_char;
    int m_pos;
};
MTS_IMPLEMENT_CLASS(ROT13WorkUnit, false, WorkUnit)
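
// The diff omits the matching ROT13WorkResult. Judging from the calls made
// by the work processor below (setPos/setChar when producing, getPos/getChar
// when collecting results), it presumably mirrors the work unit; a sketch:
class ROT13WorkResult : public WorkResult {
public:
    void load(Stream *stream) {
        m_char = stream->readChar();
        m_pos = stream->readInt();
    }

    void save(Stream *stream) const {
        stream->writeChar(m_char);
        stream->writeInt(m_pos);
    }

    std::string toString() const {
        return "ROT13WorkResult[..]";
    }

    inline char getChar() const { return m_char; }
    inline void setChar(char value) { m_char = value; }
    inline int getPos() const { return m_pos; }
    inline void setPos(int value) { m_pos = value; }

    MTS_DECLARE_CLASS()
protected:
    /// Virtual destructor
    virtual ~ROT13WorkResult() { }
private:
    char m_char;
    int m_pos;
};

MTS_IMPLEMENT_CLASS(ROT13WorkResult, false, WorkResult)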
@ -175,42 +175,42 @@ remote worker nodes and replicated amongst local threads.
\begin{cpp}
class ROT13WorkProcessor : public WorkProcessor {
public:
    /// Construct a new work processor
    ROT13WorkProcessor() : WorkProcessor() { }

    /// Unserialize from a binary data stream (nothing to do in our case)
    ROT13WorkProcessor(Stream *stream, InstanceManager *manager)
        : WorkProcessor(stream, manager) { }

    /// Serialize to a binary data stream (nothing to do in our case)
    void serialize(Stream *stream, InstanceManager *manager) const {
    }

    ref<WorkUnit> createWorkUnit() const {
        return new ROT13WorkUnit();
    }

    ref<WorkResult> createWorkResult() const {
        return new ROT13WorkResult();
    }

    ref<WorkProcessor> clone() const {
        return new ROT13WorkProcessor(); // No state to clone in our case
    }

    /// No internal state, thus no preparation is necessary
    void prepare() { }

    /// Do the actual computation
    void process(const WorkUnit *workUnit, WorkResult *workResult,
            const bool &stop) {
        const ROT13WorkUnit *wu
            = static_cast<const ROT13WorkUnit *>(workUnit);
        ROT13WorkResult *wr = static_cast<ROT13WorkResult *>(workResult);
        wr->setPos(wu->getPos());
        wr->setChar((std::toupper(wu->getChar()) - 'A' + 13) % 26 + 'A');
    }

    MTS_DECLARE_CLASS()
};
MTS_IMPLEMENT_CLASS_S(ROT13WorkProcessor, false, WorkProcessor)
\end{cpp}
@ -226,48 +226,48 @@ implementation might look as follows:
\begin{cpp}
class ROT13Process : public ParallelProcess {
public:
    ROT13Process(const std::string &input) : m_input(input), m_pos(0) {
        m_output.resize(m_input.length());
    }

    ref<WorkProcessor> createWorkProcessor() const {
        return new ROT13WorkProcessor();
    }

    std::vector<std::string> getRequiredPlugins() {
        std::vector<std::string> result;
        result.push_back("rot13");
        return result;
    }

    EStatus generateWork(WorkUnit *unit, int worker /* unused */) {
        if (m_pos >= (int) m_input.length())
            return EFailure;
        ROT13WorkUnit *wu = static_cast<ROT13WorkUnit *>(unit);

        wu->setPos(m_pos);
        wu->setChar(m_input[m_pos++]);

        return ESuccess;
    }

    void processResult(const WorkResult *result, bool cancelled) {
        if (cancelled) // indicates a work unit, which was
            return;    // cancelled partly through its execution
        const ROT13WorkResult *wr =
            static_cast<const ROT13WorkResult *>(result);
        m_output[wr->getPos()] = wr->getChar();
    }

    inline const std::string &getOutput() {
        return m_output;
    }

    MTS_DECLARE_CLASS()
private:
    std::string m_input;
    std::string m_output;
    int m_pos;
};
MTS_IMPLEMENT_CLASS(ROT13Process, false, ParallelProcess)
\end{cpp}
@ -281,25 +281,25 @@ loads the \code{ROT13*} classes at the right moment.
To actually use the \code{ROT13} encoder, we must first launch the newly created parallel process
from the main utility function (the `Hello World' code we wrote earlier). We can adapt it as follows:
\begin{cpp}
int run(int argc, char **argv) {
    if (argc < 2) {
        cout << "Syntax: mtsutil rot13 <text>" << endl;
        return -1;
    }

    ref<ROT13Process> proc = new ROT13Process(argv[1]);
    ref<Scheduler> sched = Scheduler::getInstance();

    /* Submit the encryption job to the scheduler */
    sched->schedule(proc);

    /* Wait for its completion */
    sched->wait(proc);

    cout << "Result: " << proc->getOutput() << endl;

    return 0;
}
\end{cpp}
After compiling everything using \code{scons}, a simple example
involving the utility would be to encode a string (e.g. \code{SECUREBYDESIGN}), while
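Presumably, the invocation and its output look like this (the ROT13 of
\code{SECUREBYDESIGN} worked out by hand; the \code{Result:} prefix follows the
\code{run()} method above):
\begin{shell}
$\texttt{\$}$ mtsutil rot13 SECUREBYDESIGN
Result: FRPHEROLQRFVTA
\end{shell}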

View File

@ -10,21 +10,21 @@ The documentation of a plugin always starts on a new page and is preceded
by a table similar to the one below:
\parameters{
    \parameter{softRays}{\Boolean}{
        Try not to damage objects in the scene by shooting softer rays
        \default{\code{false}}
    }
    \parameter{darkMatter}{\Float}{
        Controls the proportionate amount of dark matter present in the scene.
        \default{0.83}
    }
}
Suppose this hypothetical plugin is an \emph{integrator} named \code{amazing}. Then, based on
this description, it can be instantiated from an XML scene file using a custom configuration such as:
\begin{xml}
<integrator type="amazing">
<boolean name="softerRays" value="true"/>
<float name="darkMatter" value="0.4"/>
<boolean name="softerRays" value="true"/>
<float name="darkMatter" value="0.4"/>
</integrator>
\end{xml}
In some cases\footnote{Note that obvious parameters are generally omitted.
@ -33,20 +33,20 @@ is left out from the documentation for brevity.}, plugins also indicate that the
as input arguments. These can either be \emph{named} or \emph{unnamed}. If
the \code{amazing} integrator also accepted the following two parameters\vspace{-2mm}
\parameters{
    \parameter{\Unnamed}{\Integrator}{A nested integrator which does the actual hard work}
    \parameter{puppies}{\Texture}{This must be used to supply a \mbox{cute picture of puppies}}
}
\vspace{-1mm}
then it can be instantiated e.g. as follows
\begin{xml}
<integrator type="amazing">
<boolean name="softerRays" value="true"/>
<float name="darkMatter" value="0.4"/>
<integrator type="path"/>
<texture name="puppies" type="bitmap">
<string name="filename" value="cute.jpg"/>
</texture>
<boolean name="softerRays" value="true"/>
<float name="darkMatter" value="0.4"/>
<integrator type="path"/>
<texture name="puppies" type="bitmap">
<string name="filename" value="cute.jpg"/>
</texture>
</integrator>
\end{xml}
or, if these were already instantiated previously and are now
@ -54,10 +54,10 @@ bound to the \emph{identifiers} (\secref{format}) \code{myPathTracer} and
\code{myTexture}, the following also works:
\begin{xml}
<integrator type="amazing">
<boolean name="softerRays" value="true"/>
<float name="darkMatter" value="0.4"/>
<ref id="myPathTracer"/>
<ref name="puppies" id="myTexture"/>
<boolean name="softerRays" value="true"/>
<float name="darkMatter" value="0.4"/>
<ref id="myPathTracer"/>
<ref name="puppies" id="myTexture"/>
</integrator>
\end{xml}

View File

@ -153,7 +153,7 @@ scheduler = Scheduler.getInstance()
# Start up the scheduling system with one worker per local core
for i in range(0, multiprocessing.cpu_count()):
    scheduler.registerWorker(LocalWorker(i, 'wrk%i' % i))
scheduler.start()
# Create a queue for tracking render jobs
@ -204,9 +204,9 @@ pmgr = PluginManager.getInstance()
# Encodes parameters on how to instantiate the 'perspective' plugin
sensorProps = Properties('perspective')
sensorProps['toWorld'] = Transform.lookAt(
    Point(0, 0, -10),  # Camera origin
    Point(0, 0, 0),    # Camera target
    Vector(0, 1, 0)    # 'up' vector
)
sensorProps['fov'] = 45.0
@ -243,17 +243,17 @@ from mitsuba.core import *
pmgr = PluginManager.getInstance()
sensor = pmgr.create({
    'type' : 'perspective',
    'toWorld' : Transform.lookAt(
        Point(0, 0, -10),
        Point(0, 0, 0),
        Vector(0, 1, 0)
    ),
    'film' : {
        'type' : 'ldrfilm',
        'width' : 1920,
        'height' : 1080
    }
})
\end{python}
This code does exactly the same as the previous snippet.
@ -273,44 +273,44 @@ scene = Scene()
# Create a sensor, film & sample generator
scene.addChild(pmgr.create({
    'type' : 'perspective',
    'toWorld' : Transform.lookAt(
        Point(0, 0, -10),
        Point(0, 0, 0),
        Vector(0, 1, 0)
    ),
    'film' : {
        'type' : 'ldrfilm',
        'width' : 1920,
        'height' : 1080
    },
    'sampler' : {
        'type' : 'ldsampler',
        'sampleCount' : 2
    }
}))
# Set the integrator
scene.addChild(pmgr.create({
    'type' : 'direct'
}))
# Add a light source
scene.addChild(pmgr.create({
    'type' : 'point',
    'position' : Point(5, 0, -10),
    'intensity' : Spectrum(100)
}))
# Add a shape
scene.addChild(pmgr.create({
    'type' : 'sphere',
    'center' : Point(0, 0, 0),
    'radius' : 1.0,
    'bsdf' : {
        'type' : 'diffuse',
        'reflectance' : Spectrum(0.4)
    }
}))
scene.configure()
@ -334,17 +334,17 @@ import mitsuba
from mitsuba.core import *
class MyFormatter(Formatter):
    def format(self, logLevel, sourceClass, sourceThread, message, filename, line):
        return '%s (log level: %s, thread: %s, class %s, file %s, line %i)' % \
            (message, str(logLevel), sourceThread.getName(), sourceClass,
             filename, line)
class MyAppender(Appender):
    def append(self, logLevel, message):
        print(message)

    def logProgress(self, progress, name, formatted, eta):
        print('Progress message: ' + formatted)
# Get the logger associated with the current thread
logger = Thread.getThread().getLogger()
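# Hooking up the two custom classes would presumably look as follows;
# these Logger method names are assumptions based on Mitsuba's API:
logger.setFormatter(MyFormatter())
logger.clearAppenders()
logger.addAppender(MyAppender())
logger.setLogLevel(EDebug)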
@ -810,3 +810,32 @@ Suppose that \code{bitmap} contains a \code{mitsuba.core.Bitmap} instance (e.g.
import numpy as np
array = np.array(bitmap.getNativeBuffer())
\end{python}
\subsubsection{Simultaneously rendering multiple versions of a scene}
\begin{python}
for i in range(2):
    destination = 'renderedResult' + str(i)

    # Create a shallow copy of the scene so that the queue can tell apart the two
    # rendering processes. This takes almost no extra memory
    newScene = Scene(scene)
    newSensor = PluginManager.getInstance().createObject(scene.getSensor().getProperties())
    newFilm = PluginManager.getInstance().createObject(scene.getFilm().getProperties())
    newFilm.configure()
    newSensor.addChild(newFilm)
    newSensor.configure()
    newScene.addSensor(newSensor)
    newScene.setSensor(newSensor)
    newScene.setSampler(scene.getSampler())
    newScene.setDestinationFile(destination)

    # Create a render job and insert it into the queue
    job = RenderJob('myRenderJob' + str(i), newScene, queue, sceneResID)
    job.start()

# Wait for all jobs to finish and release resources
# (this can also be moved into the loop if only one job should be active at a time)
queue.waitLeft(0)
queue.join()
\end{python}

View File

@ -4,10 +4,10 @@
\centering
\includegraphics[width=15.5cm]{images/bsdf_overview.pdf}
\caption{
Schematic overview of the most important surface scattering models in
Mitsuba (shown in the style of Weidlich and Wilkie \cite{Weidlich2007Arbitrarily}). The arrows indicate possible outcomes of an
interaction with a surface that has the respective model applied to it.
\vspace{4mm}
}
\end{figure}
@ -44,22 +44,22 @@ be named and then later referenced by their name.
The following fragment shows an example of both kinds of usages:
\begin{xml}
<scene version=$\MtsVer$>
    <!-- Creating a named BSDF for later use -->
    <bsdf type=".. BSDF type .." id="myNamedMaterial">
        <!-- BSDF parameters go here -->
    </bsdf>

    <shape type="sphere">
        <!-- Example of referencing a named material -->
        <ref id="myNamedMaterial"/>
    </shape>

    <shape type="sphere">
        <!-- Example of instantiating an unnamed material -->
        <bsdf type=".. BSDF type ..">
            <!-- BSDF parameters go here -->
        </bsdf>
    </shape>
</scene>
\end{xml}
It is generally more economical to use named BSDFs when they
@ -72,13 +72,13 @@ memory usage.
\includegraphics[width=15cm]{images/glass_explanation.pdf}
\vspace{-5mm}
\caption{
    \label{fig:glass-explanation}
    Some of the scattering models in Mitsuba need to know
    the indices of refraction on the exterior and interior-facing
    side of a surface.
    It is therefore important to decompose the mesh into meaningful
    separate surfaces corresponding to each index of refraction change.
    The example here shows such a decomposition for a water-filled glass.
}
\end{figure}

View File

@ -9,7 +9,7 @@ types is shown below:
\centering
\includegraphics[width=15.5cm]{images/emitter_overview.pdf}
\caption{
Schematic overview of the most important emitters in Mitsuba.
The arrows indicate the directional distribution of light.
}
\end{figure}

View File

@ -9,22 +9,22 @@ more scientifically oriented data formats (e.g. MATLAB or Mathematica).
In the XML scene description language, a normal film configuration might look as follows
\begin{xml}
<scene version=$\MtsVer$>
    <!-- ... scene contents ... -->

    <sensor type="... sensor type ...">
        <!-- ... sensor parameters ... -->

        <!-- Write to a high dynamic range EXR image -->
        <film type="hdrfilm">
            <!-- Specify the desired resolution (e.g. full HD) -->
            <integer name="width" value="1920"/>
            <integer name="height" value="1080"/>

            <!-- Use a Gaussian reconstruction filter. For
                 details on these, refer to the next subsection -->
            <rfilter type="gaussian"/>
        </film>
    </sensor>
</scene>
\end{xml}
The \code{film} plugin should be instantiated nested inside a \code{sensor} declaration.

View File

@ -15,16 +15,16 @@ is usually instantiated by declaring it at the top level within the
scene, e.g.
\begin{xml}
<scene version=$\MtsVer$>
    <!-- Instantiate a unidirectional path tracer,
         which renders paths up to a depth of 5 -->
    <integrator type="path">
        <integer name="maxDepth" value="5"/>
    </integrator>

    <!-- Some geometry to be rendered -->
    <shape type="sphere">
        <bsdf type="diffuse"/>
    </shape>
</scene>
\end{xml}
@ -68,13 +68,13 @@ method (\pluginref{mlt}, \pluginref{erpt}).
\smallrendering{Max. depth = 3}{pathdepth-3}
\smallrendering{Max. depth = $\infty$}{pathdepth-all}
\caption{
    \label{fig:pathdepths}
    These Cornell box renderings demonstrate the visual
    effect of a maximum path depth. As the paths
    are allowed to grow longer, the color saturation
    increases due to multiple scattering interactions
    with the colored surfaces. At the same time, the
    computation time increases.
}
\end{figure}
@ -94,10 +94,10 @@ depth unlimited.
\includegraphics[width=10cm]{images/path_explanation.pdf}
\vspace{-5mm}
\caption{
    \label{fig:path-explanation}
    A ray of emitted light is scattered by an object and subsequently
    reaches the eye/sensor.
    In Mitsuba, this is a \emph{depth-2} path, since it has two edges.
}
\end{figure}
Mitsuba counts depths starting at $1$, which correspond to

View File

@ -3,16 +3,16 @@
\label{sec:media}
\vspace{-1cm}
\renderings{
    \subfloat[A knitted sheep sweater (Ridged Feather pattern)]{
        \fbox{\includegraphics[width=0.58\textwidth]{images/medium_sheep}}}\hfill
    \subfloat[A knitted sweater for an alien character (Braid Cables pattern)]{
        \fbox{\includegraphics[width=0.32\textwidth]{images/medium_alien_cables}}}\hfill
    \vspace{-2mm}
    \caption{Participating media are not limited to smoke or fog: they are
        also great for rendering fuzzy materials such as these knitted sweaters
        (made using the \pluginref{heterogeneous} and \pluginref{microflake} plugins).
        Figure courtesy of Yuksel et al. \cite{Yuksel2012Stitch}, models courtesy of
        Rune Spaans and Christer Sveen.}
}
In Mitsuba, participating media are used to simulate materials ranging from
fog, smoke, and clouds, over translucent materials such as skin or milk,
@ -31,14 +31,14 @@ referencing mechanism:
\begin{xml}
<medium type="homogeneous" id="fog">
    <!-- .... homogeneous medium parameters .... -->
</medium>

<sensor type="perspective">
    <!-- .... perspective camera parameters .... -->

    <!-- Reference the fog medium from within the sensor declaration
         to make it aware that it is embedded inside this medium -->
    <ref id="fog"/>
</sensor>
\end{xml}

View File

@ -72,23 +72,23 @@ Here, a high frequency function is reconstructed at low resolutions. A good filt
resolution and attenuate the remainder to a uniform gray. The filters are ordered by their
approximate level of success at this benchmark.
\renderings{
    \subfloat[A high resolution input image whose frequency decreases
        towards the borders. If you are looking at this on a computer, you may
        have to zoom in.]{\fbox{\includegraphics[width=0.43\textwidth]{images/rfilter_sines_input}}}
    \hfill
}
\vspace{-4mm}
\renderings{
    \medrendering{Box filter}{rfilter_sines_box}
    \medrendering{Tent filter}{rfilter_sines_tent}
    \medrendering{Gaussian filter}{rfilter_sines_gaussian}
}
\vspace{-4mm}
\renderings{
    \setcounter{subfigure}{3}
    \medrendering{Mitchell-Netravali filter}{rfilter_sines_mitchell}
    \medrendering{Catmull-Rom filter}{rfilter_sines_catmullrom}
    \medrendering{Lanczos Sinc filter}{rfilter_sines_lanczos}
}
\newpage
\subsubsection{Reconstruction filter comparison 2: ringing}
@ -97,39 +97,39 @@ image contains extreme and discontinuous brightness transitions. The
Mitchell-Netravali, Catmull-Rom, and Lanczos Sinc filters are affected by this problem.
Note the black fringing around the light source in the cropped Cornell box renderings below.
\renderings{
    \rendering{Box filter}{rfilter_cbox_box}
    \rendering{Tent filter}{rfilter_cbox_tent}
}
\vspace{-4mm}
\renderings{
    \setcounter{subfigure}{2}
    \rendering{Gaussian filter}{rfilter_cbox_gaussian}
    \rendering{Mitchell-Netravali filter}{rfilter_cbox_mitchell}
}
\vspace{-4mm}
\renderings{
    \setcounter{subfigure}{4}
    \rendering{Catmull-Rom filter}{rfilter_cbox_catmullrom}
    \rendering{Lanczos Sinc filter}{rfilter_cbox_lanczos}
}
\subsubsection{Specifying a reconstruction filter}
To specify a reconstruction filter, it must be instantiated inside
the sensor's film. Below is an example:
\begin{xml}
<scene version=$\MtsVer$>
    <!-- ... scene contents ... -->

    <sensor type="... sensor type ...">
        <!-- ... sensor parameters ... -->

        <film type="... film type ...">
            <!-- ... film parameters ... -->

            <!-- Instantiate a Lanczos Sinc filter with two lobes -->
            <rfilter type="lanczos">
                <integer name="lobes" value="2"/>
            </rfilter>
        </film>
    </sensor>
</scene>
\end{xml}

View File

@ -13,19 +13,19 @@ This BSDF characterizes what happens \emph{at the surface}. In the XML scene des
the following:
\begin{xml}
<scene version=$\MtsVer$>
<shape type="... shape type ...">
... $\code{shape}$ parameters ...
<shape type="... shape type ...">
... $\code{shape}$ parameters ...
<bsdf type="... bsdf type ...">
... $\code{bsdf}$ parameters ..
</bsdf>
<bsdf type="... bsdf type ...">
... $\code{bsdf}$ parameters ..
</bsdf>
<!-- Alternatively: reference a named BSDF that
has been declared previously
<!-- Alternatively: reference a named BSDF that
has been declared previously
<ref id="myBSDF"/>
-->
</shape>
-->
</shape>
</scene>
\end{xml}
@ -35,24 +35,24 @@ of the shape. This informs the renderer about what happens in the region of spac
\begin{xml}
<scene version=$\MtsVer$>
<shape type="... shape type ...">
... $\code{shape}$ parameters ...
<shape type="... shape type ...">
... $\code{shape}$ parameters ...
<medium name="interior" type="... medium type ...">
... $\code{medium}$ parameters ...
</medium>
<medium name="interior" type="... medium type ...">
... $\code{medium}$ parameters ...
</medium>
<medium name="exterior" type="... medium type ...">
... $\code{medium}$ parameters ...
</medium>
<medium name="exterior" type="... medium type ...">
... $\code{medium}$ parameters ...
</medium>
<!-- Alternatively: reference named media that
have been declared previously
<!-- Alternatively: reference named media that
have been declared previously
<ref name="interior" id="myMedium1"/>
<ref name="exterior" id="myMedium2"/>
-->
</shape>
-->
</shape>
</scene>
\end{xml}
@ -66,29 +66,29 @@ It is also possible to create \emph{index-mismatched} boundaries between media,
the light is affected by the boundary transition:
\begin{xml}
<scene version=$\MtsVer$>
<shape type="... shape type ...">
... $\code{shape}$ parameters ...
<shape type="... shape type ...">
... $\code{shape}$ parameters ...
<bsdf type="... bsdf type ...">
... $\code{bsdf}$ parameters ..
</bsdf>
<bsdf type="... bsdf type ...">
... $\code{bsdf}$ parameters ..
</bsdf>
<medium name="interior" type="... medium type ...">
... $\code{medium}$ parameters ...
</medium>
<medium name="interior" type="... medium type ...">
... $\code{medium}$ parameters ...
</medium>
<medium name="exterior" type="... medium type ...">
... $\code{medium}$ parameters ...
</medium>
<medium name="exterior" type="... medium type ...">
... $\code{medium}$ parameters ...
</medium>
<!-- Alternatively: reference named media and BSDF
instances that have been declared previously
<!-- Alternatively: reference named media and BSDF
instances that have been declared previously
<ref id="myBSDF"/>
<ref name="interior" id="myMedium1"/>
<ref name="exterior" id="myMedium2"/>
-->
</shape>
-->
</shape>
</scene>
\end{xml}
These constitute the standard ways in which a shape can be declared.