replaced some tabs, added copyable spaces to lstlistings

Wenzel Jakob 2014-02-06 15:11:30 +01:00
parent ba2a6dcaf7
commit 093050f755
17 changed files with 591 additions and 555 deletions

View File

@ -101,40 +101,40 @@ $\texttt{\$}$ mitsuba -c machine:1234 path-to/my-scene.xml
When no port is explicitly specified, Mitsuba uses a default value of 7554.
\item \textbf{SSH}:
This approach works as follows: The renderer creates an SSH connection
to the remote side, where it launches a Mitsuba worker instance.
All subsequent communication then passes through the encrypted link.
This is completely secure but slower due to the encryption overhead.
If you are rendering a complex scene, there is a good chance that it
won't matter much since most time is spent doing computations rather than
communicating.
Such an SSH link can be created simply by using a slightly different syntax:
\begin{shell}
$\texttt{\$}$ mitsuba -c username@machine path-to/my-scene.xml
\end{shell}
The above line assumes that the remote home directory contains
a Mitsuba source directory named \code{mitsuba},
which contains the compiled Mitsuba binaries.
If that is not the case, you need to provide the path to such a directory manually, e.g.:
\begin{shell}
$\texttt{\$}$ mitsuba -c username@machine:/opt/mitsuba path-to/my-scene.xml
\end{shell}
For the SSH connection approach to work, you \emph{must} enable passwordless
authentication.
Try opening a terminal window and running the command \code{ssh username@machine}
(replace with the details of your remote connection).
If you are asked for a password, something is not set up correctly --- please see
\url{http://www.debian-administration.org/articles/152} for instructions.
On Windows, the situation is a bit more difficult since there is no suitable SSH client by
default. To get SSH connections to work, Mitsuba requires \code{plink.exe} (from PuTTY) to
be on the path. For passwordless authentication with a Linux/OSX-based
server, convert your private key to PuTTY's format using \code{puttygen.exe}.
Afterwards, start \code{pageant.exe} to load and authenticate the key. All
of these binaries are available from the PuTTY website.
It is possible to mix the two approaches to access some machines directly and others
over SSH.
\end{itemize}
When doing many network-based renders over the command line, it can become tedious to
specify the connections every time. They can alternatively be loaded from a text file
@ -155,7 +155,7 @@ For instance, you can render a scene several times with different reflectance va
on a certain material by changing its description to something like
\begin{xml}
<bsdf type="diffuse">
    <spectrum name="reflectance" value="$\texttt{\$}$reflectance"/>
</bsdf>
\end{xml}
and running Mitsuba as follows:

View File

@ -322,10 +322,10 @@ configuration file from the \texttt{build} directory.
\subsection{Building on Mac OS X}
\vspace{-5mm}
\remarks{
\item Unfortunately, OpenMP is not available when compiling
using the regular \code{clang} toolchain (it is available when using Intel XE Composer). This will cause the following parts of Mitsuba
to run single-threaded: bitmap resampling (i.e. MIP map generation), blue noise point generation in the \pluginref{dipole}
plugin, as well as the \pluginref{ppm} and \pluginref{sppm} plugins.
}
Compiling Mitsuba's dependencies on Mac OS is a laborious process; for convenience, there
is a repository that provides them in precompiled form. To use this repository, clone it

View File

@ -10,15 +10,15 @@ to become part of the main codebase.
Mitsuba is split into four basic support libraries:
\begin{itemize}
\item The core library (\code{libcore}) implements basic functionality such as
cross-platform file and bitmap I/O, data structures, scheduling, as well as logging and plugin management.
\item The rendering library (\code{librender}) contains abstractions
needed to load and represent scenes containing light sources, shapes, materials, and participating media.
\item The hardware acceleration library (\code{libhw})
implements a cross-platform display library, an object-oriented OpenGL
wrapper, as well as support for rendering interactive previews of scenes.
\item Finally, the bidirectional library (\code{libbidir})
contains a support layer that is used to implement bidirectional rendering algorithms such as
Bidirectional Path Tracing and Metropolis Light Transport.
\end{itemize}
A detailed reference of these APIs is available at
\url{http://www.mitsuba-renderer.org/api}. The next sections
@ -33,18 +33,18 @@ this way, otherwise the source code layout will look garbled.
same line to make the best use of vertical space, i.e.
\begin{cpp}
if (x > y) {
    x = y;
}
\end{cpp}
\paragraph{Placement of spaces:} Placement of spaces follows K\&R, e.g.
\begin{cpp}
if (x == y) {
    ..
} else if (x > y) {
    ..
} else {
    ..
}
\end{cpp}
rather than things like this
@ -61,12 +61,12 @@ have the prefix \code{m\_}. Here is an example:
\begin{cpp}
class MyClass {
public:
    MyClass(int value) : m_value(value) { }
    inline void setValue(int value) { m_value = value; }
    inline int getValue() const { return m_value; }
private:
    int m_value;
};
\end{cpp}
@ -74,9 +74,9 @@ private:
start with a capital \textbf{E}, e.g.
\begin{cpp}
enum ETristate {
    ENo = 0,
    EYes,
    EMaybe
};
\end{cpp}
\paragraph{Constant methods and parameters:} Declare member functions and
@ -102,8 +102,8 @@ counting. This is done using the \code{ref<>} template, e.g.
\begin{cpp}
if (..) {
    ref<MyClass> instance = new MyClass();
    instance->doSomething();
} // reference expires, instance will be deallocated
\end{cpp}

View File

@ -12,9 +12,9 @@ something like this:
\begin{xml}
<?xml version="1.0" encoding="utf-8"?>
<scene version=$\MtsVer$>
    <shape type="obj">
        <string name="filename" value="dragon.obj"/>
    </shape>
</scene>
\end{xml}
The scene version attribute denotes the release of Mitsuba that was used to
@ -35,9 +35,9 @@ Similarly, you could write
\begin{xml}
<?xml version="1.0" encoding="utf-8"?>
<scene version=$\MtsVer$>
    <shape type="sphere">
        <float name="radius" value="10"/>
    </shape>
</scene>
\end{xml}
This loads a different plugin (\code{sphere}) which is still a \emph{Shape}, but instead represents
@ -51,55 +51,55 @@ and one or more emitters. Here is a more complex example:
<?xml version="1.0" encoding="utf-8"?>
<scene version=$\MtsVer$>
    <integrator type="path">
        <!-- Path trace with a max. path length of 8 -->
        <integer name="maxDepth" value="8"/>
    </integrator>
    <!-- Instantiate a perspective camera with 45 degrees field of view -->
    <sensor type="perspective">
        <!-- Rotate the camera around the Y axis by 180 degrees -->
        <transform name="toWorld">
            <rotate y="1" angle="180"/>
        </transform>
        <float name="fov" value="45"/>
        <!-- Render with 32 samples per pixel using a basic
             independent sampling strategy -->
        <sampler type="independent">
            <integer name="sampleCount" value="32"/>
        </sampler>
        <!-- Generate an EXR image at HD resolution -->
        <film type="hdrfilm">
            <integer name="width" value="1920"/>
            <integer name="height" value="1080"/>
        </film>
    </sensor>
    <!-- Add a dragon mesh made of rough glass (stored as OBJ file) -->
    <shape type="obj">
        <string name="filename" value="dragon.obj"/>
        <bsdf type="roughdielectric">
            <!-- Tweak the roughness parameter of the material -->
            <float name="alpha" value="0.01"/>
        </bsdf>
    </shape>
    <!-- Add another mesh -- this time, stored using Mitsuba's own
         (compact) binary representation -->
    <shape type="serialized">
        <string name="filename" value="lightsource.serialized"/>
        <transform name="toWorld">
            <translate x="5" y="-3" z="1"/>
        </transform>
        <!-- This mesh is an area emitter -->
        <emitter type="area">
            <rgb name="radiance" value="100,400,100"/>
        </emitter>
    </shape>
</scene>
\end{xml}
This example introduces several new object types (\code{integrator, sensor, bsdf, sampler, film}, and \code{emitter})
@ -211,10 +211,10 @@ are allowed. Here is an example:
\end{xml}
\renderings{
    \fbox{\includegraphics[width=10cm]{images/blackbody}}
    \hfill\,
    \caption{\label{fig:blackbody}A few simulated
    black body emitters over a range of temperature values}
}
\label{sec:blackbody}
Finally, it is also possible to specify the spectral distribution of a black body emitter (\figref{blackbody}),
@ -252,8 +252,8 @@ with the identity, one can build up a transformation using a sequence of command
does a translation followed by a rotation might be written like this:
\begin{xml}
<transform name="trafoProperty">
    <translate x="-1" y="3" z="4"/>
    <rotate y="1" angle="45"/>
</transform>
\end{xml}
Mathematically, each incremental transformation in the sequence is left-multiplied onto the current one. The following
@ -323,22 +323,22 @@ to declare it over and over again, which wastes memory, you can make use of refe
of how this works:
\begin{xml}
<scene version=$\MtsVer$>
    <texture type="bitmap" id="myImage">
        <string name="filename" value="textures/myImage.jpg"/>
    </texture>
    <bsdf type="diffuse" id="myMaterial">
        <!-- Reference the texture named myImage and pass it
             to the BRDF as the reflectance parameter -->
        <ref name="reflectance" id="myImage"/>
    </bsdf>
    <shape type="obj">
        <string name="filename" value="meshes/myShape.obj"/>
        <!-- Reference the material named myMaterial -->
        <ref id="myMaterial"/>
    </shape>
</scene>
\end{xml}
By providing a unique \texttt{id} attribute in the

View File

@ -40,7 +40,7 @@ MTS_NAMESPACE_BEGIN
class MyIntegrator : public SamplingIntegrator {
public:
    MTS_DECLARE_CLASS()
};
MTS_IMPLEMENT_CLASS_S(MyIntegrator, false, SamplingIntegrator)
@ -87,7 +87,7 @@ public:
    }
private:
    Spectrum m_color;
\end{cpp}
This code fragment sets up a default color (a light shade of green), which
@ -96,7 +96,7 @@ the integrator from an XML document like this
\begin{xml}
<integrator type="myIntegrator">
    <spectrum name="color" value="1.0"/>
</integrator>
\end{xml}
in which case white would take preference.
@ -189,11 +189,11 @@ substituted based the compilation flags. This variable constitutes local
state, thus it must not be forgotten in the serialization- and unserialization routines:
append
\begin{cpp}
m_maxDist = stream->readFloat();
\end{cpp}
and
\begin{cpp}
stream->writeFloat(m_maxDist);
\end{cpp}
to the unserialization constructor and the \code{serialize} method, respectively.
@ -202,24 +202,24 @@ distance to all corners of the bounding box, which encloses the scene.
To avoid having to do this every time \code{Li()} is called,
we can override the \code{preprocess} function:
\begin{cpp}
/// Preprocess function -- called on the initiating machine
bool preprocess(const Scene *scene, RenderQueue *queue,
        const RenderJob *job, int sceneResID, int cameraResID,
        int samplerResID) {
    SamplingIntegrator::preprocess(scene, queue, job, sceneResID,
        cameraResID, samplerResID);
    const AABB &sceneAABB = scene->getAABB();
    /* Find the camera position at t=0 seconds */
    Point cameraPosition = scene->getSensor()->getWorldTransform()->eval(0).transformAffine(Point(0.0f));
    m_maxDist = - std::numeric_limits<Float>::infinity();
    for (int i=0; i<8; ++i)
        m_maxDist = std::max(m_maxDist,
            (cameraPosition - sceneAABB.getCorner(i)).length());
    return true;
}
\end{cpp}
The bottom of this function should be relatively self-explanatory. The
numerous arguments at the top are related to the parallelization layer, which will be
@ -238,11 +238,11 @@ other nodes before the rendering begins.
Now, replace the body of the \code{Li} method with
\begin{cpp}
if (rRec.rayIntersect(r)) {
    Float distance = rRec.its.t;
    return Spectrum(1.0f - distance/m_maxDist) * m_color;
}
return Spectrum(0.0f);
\end{cpp}
and the distance renderer is done!
\begin{center}
@ -251,11 +251,11 @@ and the distance renderer is done!
There are a few more noteworthy details: first of all, the ``usual'' way
to intersect a ray against the scene actually works like this:
\begin{cpp}
Intersection its;
Ray ray = ...;
if (scene->rayIntersect(ray, its)) {
    /* Do something with the intersection stored in 'its' */
}
\end{cpp}
As you can see, we did something slightly different in the distance
renderer fragment above (we called \code{RadianceQueryRecord::rayIntersect()}
@ -269,11 +269,11 @@ into a scene XML file:
\begin{xml}
<!-- Adaptively integrate using the nested technique -->
<integrator type="adaptive">
    <!-- Irradiance caching + final gathering with the nested technique -->
    <integrator type="irrcache">
        <!-- Simple direct illumination technique -->
        <integrator type="direct"/>
    </integrator>
</integrator>
\end{xml}
To support this kind of complex interaction, some information needs to be passed between the
@ -294,16 +294,16 @@ as possible. Your overall code might for example be structured like this:
\begin{cpp}
Spectrum Li(const RayDifferential &r, RadianceQueryRecord &rRec) const {
    Spectrum result;
    if (rRec.type & RadianceQueryRecord::EEmittedRadiance) {
        // Emitted surface radiance contribution was requested
        result += ...;
    }
    if (rRec.type & RadianceQueryRecord::EDirectRadiance) {
        // Direct illumination contribution was requested
        result += ...;
    }
    ...
    return result;
}
\end{cpp}

View File

@ -91,8 +91,8 @@ Mitsuba is free software and can be redistributed and modified under the terms o
Public License (Version 3) as provided by the Free Software Foundation.
\remarks{
\item Being a ``viral'' license, the GPL automatically applies to all
derivative works. Amongst other things, this means that without express
permission, Mitsuba's source code is \emph{off-limits} to companies that
develop rendering software not distributed under a compatible license.
}

View File

@ -61,19 +61,19 @@
\pagestyle{scrheadings}
\usepackage[
    bookmarks,
    bookmarksnumbered,
    colorlinks,
    plainpages=false,
    pdfpagelabels,
    hypertexnames=false,
    linkcolor=myblue,
    urlcolor=myblue,
    citecolor=myblue,
    pdftitle={Mitsuba \MitsubaVersion\, Documentation},
    pdfauthor={Wenzel Jakob},
    pdfstartview=FitH
]{hyperref}
\definecolor{myblue}{rgb}{0,.1,.6}
@ -85,40 +85,47 @@
\definecolor{remark}{rgb}{1.0, 0.9, 0.9}
\definecolor{remarkframe}{rgb}{1.0, 0.7, 0.7}
% requires the latest version of package accsupp
\usepackage[space=true]{accsupp}
\newcommand{\copyablespace}{\BeginAccSupp{method=hex,unicode,ActualText=00A0}\ \EndAccSupp{}}
% Listings settings
\lstset{
    basicstyle = \small\ttfamily\raggedright,
    commentstyle=\color{lstcomment}\itshape,
    stringstyle=\color{lstattrib},
    mathescape = true,
    frame = lrtb,
    backgroundcolor = \color{lstshade},
    rulecolor = \color{lstframe},
    tabsize = 4,
    columns = fullflexible,
    keepspaces,
    belowskip = \smallskipamount,
    framerule = .7pt,
    breaklines = true,
    showstringspaces = false,
    keywordstyle = \bfseries,
    captionpos = b,
    upquote = true,
    literate={*}{{\char42}}1
        {-}{{\char45}}1
        {\ }{{\copyablespace}}1
}
\lstdefinelanguage{xml} {
    sensitive=true,
    morecomment=[s][\color{lstcomment}\itshape]{<!--}{-->},
    morecomment=[s][\color{lstcomment}]{<?}{?>},
    string=[b]", stringstyle=\color{lstattrib},
    keywords= [1] {
        shape,bsdf,scene,texture,phase,integer,float,
        string,transform,ref,rgb,srgb,spectrum,blackbody,
        medium,film,sampler,integrator,emitter,sensor,
        translate,rotate,scale,lookat,point,vector,matrix,
        include,fscat,volume,alias,rfilter,boolean,
        subsurface,animation
    },
}
@ -133,25 +140,25 @@
\setlength{\intextsep}{3pt}
\lstnewenvironment{shell}[1][]{\lstset{#1}}
{}
\lstnewenvironment{cpp}[1][]{\lstset{language=c++, #1}}
{}
\lstnewenvironment{python}[1][]{\lstset{language=Python, #1}}
{}
\lstnewenvironment{xml}[1][]{\lstset{language=xml, #1}}
{}
\lstnewenvironment{console}[1][]{\lstset{basicstyle=\footnotesize\ttfamily, float, #1}}
{}
% ----- 8< ----- 8< ------
\title{
    \vspace{3cm}
    \includegraphics[width=4cm]{images/logo_plain.pdf}\\\vspace{1.5cm}
    \Huge
    Mitsuba Documentation\\\vspace{4mm}
    \LARGE Version \MitsubaVersion
    \vspace{5mm}
}
\author{Wenzel Jakob}
\date{\today}

View File

@ -36,12 +36,12 @@ MTS_NAMESPACE_BEGIN
class ROT13Encoder : public Utility {
public:
    int run(int argc, char **argv) {
        cout << "Hello world!" << endl;
        return 0;
    }
    MTS_DECLARE_UTILITY()
};
MTS_EXPORT_UTILITY(ROT13Encoder, "Perform a ROT13 encryption of a string")
@ -58,8 +58,8 @@ $\texttt{\$}$ mtsutil
..
The following utilities are available:
   addimages   Generate linear combinations of EXR images
   rot13       Perform a ROT13 encryption of a string
\end{shell}
It can be executed as follows:
\begin{shell}
@ -82,22 +82,22 @@ For reference, here are the interfaces of \code{WorkUnit} and \code{WorkResult}:
 */
class MTS_EXPORT_CORE WorkUnit : public Object {
public:
    /// Copy the content of another work unit of the same type
    virtual void set(const WorkUnit *workUnit) = 0;
    /// Fill the work unit with content acquired from a binary data stream
    virtual void load(Stream *stream) = 0;
    /// Serialize a work unit to a binary data stream
    virtual void save(Stream *stream) const = 0;
    /// Return a string representation
    virtual std::string toString() const = 0;
    MTS_DECLARE_CLASS()
protected:
    /// Virtual destructor
    virtual ~WorkUnit() { }
};
/**
 * Abstract work result. Represents the information that encodes
@ -105,60 +105,60 @@ protected:
 */
class MTS_EXPORT_CORE WorkResult : public Object {
public:
    /// Fill the work result with content acquired from a binary data stream
    virtual void load(Stream *stream) = 0;
    /// Serialize a work result to a binary data stream
    virtual void save(Stream *stream) const = 0;
    /// Return a string representation
    virtual std::string toString() const = 0;
    MTS_DECLARE_CLASS()
protected:
    /// Virtual destructor
    virtual ~WorkResult() { }
};
\end{cpp}
In our case, the \code{WorkUnit} implementation then looks like this:
\begin{cpp} \begin{cpp}
class ROT13WorkUnit : public WorkUnit { class ROT13WorkUnit : public WorkUnit {
public: public:
void set(const WorkUnit *workUnit) { void set(const WorkUnit *workUnit) {
const ROT13WorkUnit *wu = const ROT13WorkUnit *wu =
static_cast<const ROT13WorkUnit *>(workUnit); static_cast<const ROT13WorkUnit *>(workUnit);
m_char = wu->m_char; m_char = wu->m_char;
m_pos = wu->m_pos; m_pos = wu->m_pos;
} }
void load(Stream *stream) { void load(Stream *stream) {
m_char = stream->readChar(); m_char = stream->readChar();
m_pos = stream->readInt(); m_pos = stream->readInt();
} }
void save(Stream *stream) const { void save(Stream *stream) const {
stream->writeChar(m_char); stream->writeChar(m_char);
stream->writeInt(m_pos); stream->writeInt(m_pos);
} }
std::string toString() const { std::string toString() const {
std::ostringstream oss; std::ostringstream oss;
oss << "ROT13WorkUnit[" << endl oss << "ROT13WorkUnit[" << endl
<< " char = '" << m_char << "'," << endl << " char = '" << m_char << "'," << endl
<< " pos = " << m_pos << endl << " pos = " << m_pos << endl
<< "]"; << "]";
return oss.str(); return oss.str();
} }
inline char getChar() const { return m_char; } inline char getChar() const { return m_char; }
inline void setChar(char value) { m_char = value; } inline void setChar(char value) { m_char = value; }
inline int getPos() const { return m_pos; } inline int getPos() const { return m_pos; }
inline void setPos(int value) { m_pos = value; } inline void setPos(int value) { m_pos = value; }
MTS_DECLARE_CLASS() MTS_DECLARE_CLASS()
private: private:
char m_char; char m_char;
int m_pos; int m_pos;
}; };
MTS_IMPLEMENT_CLASS(ROT13WorkUnit, false, WorkUnit) MTS_IMPLEMENT_CLASS(ROT13WorkUnit, false, WorkUnit)
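Since \code{Stream} is an untyped byte stream, \code{load()} must read the fields in exactly the order in which \code{save()} wrote them. The round-trip can be sketched in plain Python using the standard \code{struct} module (this stands in for Mitsuba's \code{Stream} purely for illustration):

```python
import struct

def save(char, pos):
    # Write one ASCII char followed by a 32-bit integer, mirroring
    # the field order of ROT13WorkUnit::save()
    return struct.pack('<ci', char.encode('ascii'), pos)

def load(buf):
    # load() must consume the fields in the same order save() produced them
    char, pos = struct.unpack('<ci', buf)
    return char.decode('ascii'), pos

assert load(save('K', 42)) == ('K', 42)
```

Swapping the two reads (or the two writes, but not both) would silently corrupt every work unit sent over the wire.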
@@ -175,42 +175,42 @@ remote worker nodes and replicated amongst local threads.
\begin{cpp}
class ROT13WorkProcessor : public WorkProcessor {
public:
    /// Construct a new work processor
    ROT13WorkProcessor() : WorkProcessor() { }
    /// Unserialize from a binary data stream (nothing to do in our case)
    ROT13WorkProcessor(Stream *stream, InstanceManager *manager)
        : WorkProcessor(stream, manager) { }
    /// Serialize to a binary data stream (nothing to do in our case)
    void serialize(Stream *stream, InstanceManager *manager) const {
    }
    ref<WorkUnit> createWorkUnit() const {
        return new ROT13WorkUnit();
    }
    ref<WorkResult> createWorkResult() const {
        return new ROT13WorkResult();
    }
    ref<WorkProcessor> clone() const {
        return new ROT13WorkProcessor(); // No state to clone in our case
    }
    /// No internal state, thus no preparation is necessary
    void prepare() { }
    /// Do the actual computation
    void process(const WorkUnit *workUnit, WorkResult *workResult,
            const bool &stop) {
        const ROT13WorkUnit *wu
            = static_cast<const ROT13WorkUnit *>(workUnit);
        ROT13WorkResult *wr = static_cast<ROT13WorkResult *>(workResult);
        wr->setPos(wu->getPos());
        wr->setChar((std::toupper(wu->getChar()) - 'A' + 13) % 26 + 'A');
    }
    MTS_DECLARE_CLASS()
};
MTS_IMPLEMENT_CLASS_S(ROT13WorkProcessor, false, WorkProcessor)
\end{cpp}
@@ -226,48 +226,48 @@ implementation might look as follows:
\begin{cpp}
class ROT13Process : public ParallelProcess {
public:
    ROT13Process(const std::string &input) : m_input(input), m_pos(0) {
        m_output.resize(m_input.length());
    }
    ref<WorkProcessor> createWorkProcessor() const {
        return new ROT13WorkProcessor();
    }
    std::vector<std::string> getRequiredPlugins() {
        std::vector<std::string> result;
        result.push_back("rot13");
        return result;
    }
    EStatus generateWork(WorkUnit *unit, int worker /* unused */) {
        if (m_pos >= (int) m_input.length())
            return EFailure;
        ROT13WorkUnit *wu = static_cast<ROT13WorkUnit *>(unit);
        wu->setPos(m_pos);
        wu->setChar(m_input[m_pos++]);
        return ESuccess;
    }
    void processResult(const WorkResult *result, bool cancelled) {
        if (cancelled) // indicates a work unit that was
            return;    // cancelled partway through its execution
        const ROT13WorkResult *wr =
            static_cast<const ROT13WorkResult *>(result);
        m_output[wr->getPos()] = wr->getChar();
    }
    inline const std::string &getOutput() {
        return m_output;
    }
    MTS_DECLARE_CLASS()
private:
    std::string m_input;
    std::string m_output;
    int m_pos;
};
MTS_IMPLEMENT_CLASS(ROT13Process, false, ParallelProcess)
\end{cpp}
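Stripped of the scheduler machinery, the control flow implemented by these three callbacks can be sketched in plain Python (an illustration only, not Mitsuba's actual scheduler; results are deliberately collected out of order to show why each result carries its position):

```python
def run_rot13_process(text):
    # generateWork(): one work unit per input character
    work_units = [(pos, ch) for pos, ch in enumerate(text)]

    # process(): units are transformed independently; in Mitsuba this
    # part may execute on any local thread or remote machine
    def process(unit):
        pos, ch = unit
        return pos, chr((ord(ch.upper()) - ord('A') + 13) % 26 + ord('A'))

    # processResult(): results can arrive in any order, so each one
    # records the position where it belongs in the output
    output = [None] * len(text)
    for pos, ch in map(process, reversed(work_units)):
        output[pos] = ch
    return ''.join(output)

print(run_rot13_process('HELLO'))  # URYYB
```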
@@ -281,25 +281,25 @@ loads the \code{ROT13*} classes at the right moment.
To actually use the \code{ROT13} encoder, we must first launch the newly created parallel process
from the main utility function (the `Hello World' code we wrote earlier). We can adapt it as follows:
\begin{cpp}
int run(int argc, char **argv) {
    if (argc < 2) {
        cout << "Syntax: mtsutil rot13 <text>" << endl;
        return -1;
    }
    ref<ROT13Process> proc = new ROT13Process(argv[1]);
    ref<Scheduler> sched = Scheduler::getInstance();
    /* Submit the encryption job to the scheduler */
    sched->schedule(proc);
    /* Wait for its completion */
    sched->wait(proc);
    cout << "Result: " << proc->getOutput() << endl;
    return 0;
}
\end{cpp}
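For reference, the per-character mapping applied by the utility is ROT13 over the uppercase alphabet, which is its own inverse (rotating by 13 twice adds a full cycle of 26). A quick standalone check of the expected output:

```python
def rot13(text):
    # Same arithmetic as ROT13WorkProcessor::process()
    return ''.join(chr((ord(c.upper()) - ord('A') + 13) % 26 + ord('A'))
                   for c in text)

print(rot13('SECUREBYDESIGN'))         # FRPHEROLQRFVTA
print(rot13(rot13('SECUREBYDESIGN')))  # SECUREBYDESIGN
```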
After compiling everything using \code{scons}, a simple example
involving the utility would be to encode a string (e.g. \code{SECUREBYDESIGN}), while
@@ -10,21 +10,21 @@ The documentation of a plugin always starts on a new page and is preceded
by a table similar to the one below:
\parameters{
    \parameter{softRays}{\Boolean}{
        Try not to damage objects in the scene by shooting softer rays
        \default{\code{false}}
    }
    \parameter{darkMatter}{\Float}{
        Controls the proportionate amount of dark matter present in the scene.
        \default{0.83}
    }
}
Suppose this hypothetical plugin is an \emph{integrator} named \code{amazing}. Then, based on
this description, it can be instantiated from an XML scene file using a custom configuration such as:
\begin{xml}
<integrator type="amazing">
    <boolean name="softRays" value="true"/>
    <float name="darkMatter" value="0.4"/>
</integrator>
\end{xml}
In some cases\footnote{Note that obvious parameters are generally omitted.
@@ -33,20 +33,20 @@ is left out from the documentation for brevity.}, plugins also indicate that the
as input arguments. These can either be \emph{named} or \emph{unnamed}. If
the \code{amazing} integrator also accepted the following two parameters\vspace{-2mm}
\parameters{
    \parameter{\Unnamed}{\Integrator}{A nested integrator which does the actual hard work}
    \parameter{puppies}{\Texture}{This must be used to supply a \mbox{cute picture of puppies}}
}
\vspace{-1mm}
then it can be instantiated, e.g., as follows:
\begin{xml}
<integrator type="amazing">
    <boolean name="softRays" value="true"/>
    <float name="darkMatter" value="0.4"/>
    <integrator type="path"/>
    <texture name="puppies" type="bitmap">
        <string name="filename" value="cute.jpg"/>
    </texture>
</integrator>
\end{xml}
or, if these were already instantiated previously and are now
@@ -54,10 +54,10 @@ bound to the \emph{identifiers} (\secref{format}) \code{myPathTracer} and
\code{myTexture}, the following also works:
\begin{xml}
<integrator type="amazing">
    <boolean name="softRays" value="true"/>
    <float name="darkMatter" value="0.4"/>
    <ref id="myPathTracer"/>
    <ref name="puppies" id="myTexture"/>
</integrator>
\end{xml}
@@ -153,7 +153,7 @@ scheduler = Scheduler.getInstance()
# Start up the scheduling system with one worker per local core
for i in range(0, multiprocessing.cpu_count()):
    scheduler.registerWorker(LocalWorker(i, 'wrk%i' % i))
scheduler.start()
# Create a queue for tracking render jobs
@@ -204,9 +204,9 @@ pmgr = PluginManager.getInstance()
# Encodes parameters on how to instantiate the 'perspective' plugin
sensorProps = Properties('perspective')
sensorProps['toWorld'] = Transform.lookAt(
    Point(0, 0, -10),  # Camera origin
    Point(0, 0, 0),    # Camera target
    Vector(0, 1, 0)    # 'up' vector
)
sensorProps['fov'] = 45.0
@@ -243,17 +243,17 @@ from mitsuba.core import *
pmgr = PluginManager.getInstance()
sensor = pmgr.create({
    'type' : 'perspective',
    'toWorld' : Transform.lookAt(
        Point(0, 0, -10),
        Point(0, 0, 0),
        Vector(0, 1, 0)
    ),
    'film' : {
        'type' : 'ldrfilm',
        'width' : 1920,
        'height' : 1080
    }
})
\end{python}
This code does exactly the same thing as the previous snippet.
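The correspondence is structural: every dictionary carrying a \code{'type'} entry becomes one plugin instance, and nested dictionaries become its children, just like nested tags in the XML format. A rough standalone sketch of that mapping (not the actual \code{PluginManager} logic; the \code{Transform} value is replaced by a placeholder string here):

```python
sensor_decl = {
    'type' : 'perspective',
    'toWorld' : '<placeholder for Transform.lookAt(...)>',
    'film' : {
        'type' : 'ldrfilm',
        'width' : 1920,
        'height' : 1080
    }
}

def plugin_types(decl):
    # Collect the plugin types this declaration would instantiate,
    # parents before children
    result = [decl['type']]
    for value in decl.values():
        if isinstance(value, dict) and 'type' in value:
            result += plugin_types(value)
    return result

print(plugin_types(sensor_decl))  # ['perspective', 'ldrfilm']
```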
@@ -273,44 +273,44 @@ scene = Scene()
# Create a sensor, film & sample generator
scene.addChild(pmgr.create({
    'type' : 'perspective',
    'toWorld' : Transform.lookAt(
        Point(0, 0, -10),
        Point(0, 0, 0),
        Vector(0, 1, 0)
    ),
    'film' : {
        'type' : 'ldrfilm',
        'width' : 1920,
        'height' : 1080
    },
    'sampler' : {
        'type' : 'ldsampler',
        'sampleCount' : 2
    }
}))
# Set the integrator
scene.addChild(pmgr.create({
    'type' : 'direct'
}))
# Add a light source
scene.addChild(pmgr.create({
    'type' : 'point',
    'position' : Point(5, 0, -10),
    'intensity' : Spectrum(100)
}))
# Add a shape
scene.addChild(pmgr.create({
    'type' : 'sphere',
    'center' : Point(0, 0, 0),
    'radius' : 1.0,
    'bsdf' : {
        'type' : 'diffuse',
        'reflectance' : Spectrum(0.4)
    }
}))
scene.configure()
@@ -334,17 +334,17 @@ import mitsuba
from mitsuba.core import *
class MyFormatter(Formatter):
    def format(self, logLevel, sourceClass, sourceThread, message, filename, line):
        return '%s (log level: %s, thread: %s, class %s, file %s, line %i)' % \
            (message, str(logLevel), sourceThread.getName(), sourceClass,
             filename, line)
class MyAppender(Appender):
    def append(self, logLevel, message):
        print(message)
    def logProgress(self, progress, name, formatted, eta):
        print('Progress message: ' + formatted)
# Get the logger associated with the current thread
logger = Thread.getThread().getLogger()
@@ -810,3 +810,32 @@ Suppose that \code{bitmap} contains a \code{mitsuba.core.Bitmap} instance (e.g.
import numpy as np
array = np.array(bitmap.getNativeBuffer())
\end{python}
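This works because \code{getNativeBuffer()} exposes the bitmap's pixel storage through Python's buffer protocol, so NumPy can wrap it without an intermediate copy. The same mechanism can be tried with a plain \code{bytes} object in place of a real bitmap (the shape and dtype below are illustrative; an actual bitmap's layout depends on its size, pixel format, and component format):

```python
import numpy as np

# Stand-in for a 2x3 bitmap with three 8-bit channels per pixel
raw = bytes(range(2 * 3 * 3))

# Reinterpret the flat buffer as a (height, width, channels) array
array = np.frombuffer(raw, dtype=np.uint8).reshape(2, 3, 3)
print(array.shape)  # (2, 3, 3)
```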
\subsubsection{Simultaneously rendering multiple versions of a scene}
\begin{python}
for i in range(2):
destination = 'renderedResult' + str(i)
# Create a shallow copy of the scene so that the queue can tell apart the two
# rendering processes. This takes almost no extra memory
newScene = Scene(scene)
newSensor = PluginManager.getInstance().createObject(scene.getSensor().getProperties())
newFilm = PluginManager.getInstance().createObject(scene.getFilm().getProperties())
newFilm.configure()
newSensor.addChild(newFilm)
newSensor.configure()
newScene.addSensor(newSensor)
newScene.setSensor(newSensor)
newScene.setSampler(scene.getSampler())
newScene.setDestinationFile(destination)
# Create a render job and insert it into the queue
job = RenderJob('myRenderJob' + str(i), newScene, queue, sceneResID)
job.start()
# Wait for all jobs to finish and release resources
# (moving these two lines into the loop runs only one job at a time)
queue.waitLeft(0)
queue.join()
\end{python}
@@ -4,10 +4,10 @@
\centering
\includegraphics[width=15.5cm]{images/bsdf_overview.pdf}
\caption{
    Schematic overview of the most important surface scattering models in
    Mitsuba (shown in the style of Weidlich and Wilkie \cite{Weidlich2007Arbitrarily}). The arrows indicate possible outcomes of an
    interaction with a surface that has the respective model applied to it.
    \vspace{4mm}
}
\end{figure}
@@ -44,22 +44,22 @@ be named and then later referenced by their name.
The following fragment shows an example of both kinds of usages:
\begin{xml}
<scene version=$\MtsVer$>
    <!-- Creating a named BSDF for later use -->
    <bsdf type=".. BSDF type .." id="myNamedMaterial">
        <!-- BSDF parameters go here -->
    </bsdf>
    <shape type="sphere">
        <!-- Example of referencing a named material -->
        <ref id="myNamedMaterial"/>
    </shape>
    <shape type="sphere">
        <!-- Example of instantiating an unnamed material -->
        <bsdf type=".. BSDF type ..">
            <!-- BSDF parameters go here -->
        </bsdf>
    </shape>
</scene>
\end{xml}
It is generally more economical to use named BSDFs when they
@@ -72,13 +72,13 @@ memory usage.
\includegraphics[width=15cm]{images/glass_explanation.pdf}
\vspace{-5mm}
\caption{
    \label{fig:glass-explanation}
    Some of the scattering models in Mitsuba need to know
    the indices of refraction on the exterior and interior-facing
    side of a surface.
    It is therefore important to decompose the mesh into meaningful
    separate surfaces corresponding to each index of refraction change.
    The example here shows such a decomposition for a water-filled glass.
}
\end{figure}
@@ -9,7 +9,7 @@ types is shown below:
\centering
\includegraphics[width=15.5cm]{images/emitter_overview.pdf}
\caption{
    Schematic overview of the most important emitters in Mitsuba.
    The arrows indicate the directional distribution of light.
}
\end{figure}
@@ -9,22 +9,22 @@ more scientifically oriented data formats (e.g. MATLAB or Mathematica).
In the XML scene description language, a normal film configuration might look as follows:
\begin{xml}
<scene version=$\MtsVer$>
    <!-- ... scene contents ... -->
    <sensor type="... sensor type ...">
        <!-- ... sensor parameters ... -->
        <!-- Write to a high dynamic range EXR image -->
        <film type="hdrfilm">
            <!-- Specify the desired resolution (e.g. full HD) -->
            <integer name="width" value="1920"/>
            <integer name="height" value="1080"/>
            <!-- Use a Gaussian reconstruction filter. For
                 details on these, refer to the next subsection -->
            <rfilter type="gaussian"/>
        </film>
    </sensor>
</scene>
\end{xml}
The \code{film} plugin should be instantiated nested inside a \code{sensor} declaration.
@@ -15,16 +15,16 @@ is usually instantiated by declaring it at the top level within the
scene, e.g.
\begin{xml}
<scene version=$\MtsVer$>
    <!-- Instantiate a unidirectional path tracer,
         which renders paths up to a depth of 5 -->
    <integrator type="path">
        <integer name="maxDepth" value="5"/>
    </integrator>
    <!-- Some geometry to be rendered -->
    <shape type="sphere">
        <bsdf type="diffuse"/>
    </shape>
</scene>
\end{xml}
@@ -68,13 +68,13 @@ method (\pluginref{mlt}, \pluginref{erpt}).
\smallrendering{Max. depth = 3}{pathdepth-3}
\smallrendering{Max. depth = $\infty$}{pathdepth-all}
\caption{
    \label{fig:pathdepths}
    These Cornell box renderings demonstrate the visual
    effect of a maximum path depth. As the paths
    are allowed to grow longer, the color saturation
    increases due to multiple scattering interactions
    with the colored surfaces. At the same time, the
    computation time increases.
}
\end{figure}
@@ -94,10 +94,10 @@ depth unlimited.
\includegraphics[width=10cm]{images/path_explanation.pdf}
\vspace{-5mm}
\caption{
    \label{fig:path-explanation}
    A ray of emitted light is scattered by an object and subsequently
    reaches the eye/sensor.
    In Mitsuba, this is a \emph{depth-2} path, since it has two edges.
}
\end{figure}
Mitsuba counts depths starting at $1$, which correspond to
@@ -3,16 +3,16 @@
\label{sec:media}
\vspace{-1cm}
\renderings{
    \subfloat[A knitted sheep sweater (Ridged Feather pattern)]{
        \fbox{\includegraphics[width=0.58\textwidth]{images/medium_sheep}}}\hfill
    \subfloat[A knitted sweater for an alien character (Braid Cables pattern)]{
        \fbox{\includegraphics[width=0.32\textwidth]{images/medium_alien_cables}}}\hfill
    \vspace{-2mm}
    \caption{Participating media are not limited to smoke or fog: they are
        also great for rendering fuzzy materials such as these knitted sweaters
        (made using the \pluginref{heterogeneous} and \pluginref{microflake} plugins).
        Figure courtesy of Yuksel et al. \cite{Yuksel2012Stitch}, models courtesy of
        Rune Spaans and Christer Sveen.}
}
In Mitsuba, participating media are used to simulate materials ranging from
fog, smoke, and clouds, over translucent materials such as skin or milk,
@@ -31,14 +31,14 @@ referencing mechanism:
\begin{xml}
<medium type="homogeneous" id="fog">
    <!-- .... homogeneous medium parameters .... -->
</medium>
<sensor type="perspective">
    <!-- .... perspective camera parameters .... -->
    <!-- Reference the fog medium from within the sensor declaration
         to make it aware that it is embedded inside this medium -->
    <ref id="fog"/>
</sensor>
\end{xml}
@@ -72,23 +72,23 @@ Here, a high frequency function is reconstructed at low resolutions. A good filt
resolution and attenuate the remainder to a uniform gray. The filters are ordered by their
approximate level of success at this benchmark.
\renderings{
    \subfloat[A high resolution input image whose frequency decreases
    towards the borders. If you are looking at this on a computer, you may
    have to zoom in.]{\fbox{\includegraphics[width=0.43\textwidth]{images/rfilter_sines_input}}}
    \hfill
}
\vspace{-4mm}
\renderings{
    \medrendering{Box filter}{rfilter_sines_box}
    \medrendering{Tent filter}{rfilter_sines_tent}
    \medrendering{Gaussian filter}{rfilter_sines_gaussian}
}
\vspace{-4mm}
\renderings{
    \setcounter{subfigure}{3}
    \medrendering{Mitchell-Netravali filter}{rfilter_sines_mitchell}
    \medrendering{Catmull-Rom filter}{rfilter_sines_catmullrom}
    \medrendering{Lanczos Sinc filter}{rfilter_sines_lanczos}
}
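The visual differences above come down to each filter's kernel. A minimal sketch of the two simplest kernels in their common textbook form (Mitsuba's implementations additionally handle the filter radius and normalization):

```python
def box(x, radius=0.5):
    # Constant weight within the radius: every covered sample
    # contributes equally, which blurs the least but aliases the most
    return 1.0 if abs(x) <= radius else 0.0

def tent(x, radius=1.0):
    # Weight falls off linearly with the distance to the pixel center
    return max(0.0, 1.0 - abs(x) / radius)

print(tent(0.5))  # 0.5
```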
\newpage
\subsubsection{Reconstruction filter comparison 2: ringing}
@@ -97,39 +97,39 @@ image contains extreme and discontinuous brightness transitions. The
Mitchell-Netravali, Catmull-Rom, and Lanczos Sinc filters are affected by this problem.
Note the black fringing around the light source in the cropped Cornell box renderings below.
\renderings{
    \rendering{Box filter}{rfilter_cbox_box}
    \rendering{Tent filter}{rfilter_cbox_tent}
}
\vspace{-4mm}
\renderings{
    \setcounter{subfigure}{2}
    \rendering{Gaussian filter}{rfilter_cbox_gaussian}
    \rendering{Mitchell-Netravali filter}{rfilter_cbox_mitchell}
}
\vspace{-4mm}
\renderings{
    \setcounter{subfigure}{4}
    \rendering{Catmull-Rom filter}{rfilter_cbox_catmullrom}
    \rendering{Lanczos Sinc filter}{rfilter_cbox_lanczos}
}
\subsubsection{Specifying a reconstruction filter}
A reconstruction filter must be instantiated inside
the sensor's film declaration. Below is an example:
\begin{xml}
<scene version=$\MtsVer$>
    <!-- ... scene contents ... -->
    <sensor type="... sensor type ...">
        <!-- ... sensor parameters ... -->
        <film type="... film type ...">
            <!-- ... film parameters ... -->
            <!-- Instantiate a Lanczos Sinc filter with two lobes -->
            <rfilter type="lanczos">
                <integer name="lobes" value="2"/>
            </rfilter>
        </film>
    </sensor>
</scene>
\end{xml}
@ -13,19 +13,19 @@ This BSDF characterizes what happens \emph{at the surface}. In the XML scene des
the following:
\begin{xml}
<scene version=$\MtsVer$>
    <shape type="... shape type ...">
        ... $\code{shape}$ parameters ...
        <bsdf type="... bsdf type ...">
            ... $\code{bsdf}$ parameters ...
        </bsdf>

        <!-- Alternatively: reference a named BSDF that
             has been declared previously

             <ref id="myBSDF"/>
        -->
    </shape>
</scene>
\end{xml}
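Filling in the placeholders, a minimal sketch might attach a diffuse BSDF to a
sphere. The \code{sphere} and \code{diffuse} plugin names and the
\code{reflectance} parameter are assumed here for illustration:
\begin{xml}
<scene version=$\MtsVer$>
    <shape type="sphere">
        <!-- A matte surface with a bluish-gray reflectance -->
        <bsdf type="diffuse">
            <srgb name="reflectance" value="#6d7185"/>
        </bsdf>
    </shape>
</scene>
\end{xml}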
@@ -35,24 +35,24 @@ of the shape. This informs the renderer about what happens in the region of spac
\begin{xml}
<scene version=$\MtsVer$>
    <shape type="... shape type ...">
        ... $\code{shape}$ parameters ...
        <medium name="interior" type="... medium type ...">
            ... $\code{medium}$ parameters ...
        </medium>
        <medium name="exterior" type="... medium type ...">
            ... $\code{medium}$ parameters ...
        </medium>

        <!-- Alternatively: reference named media that
             have been declared previously

             <ref name="interior" id="myMedium1"/>
             <ref name="exterior" id="myMedium2"/>
        -->
    </shape>
</scene>
\end{xml}
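For instance, a shape whose inside is filled with a scattering medium, but
whose boundary itself does not interact with light, could be declared as
follows. This sketch assumes the \code{sphere} shape and \code{homogeneous}
medium plugins with \code{sigmaS} and \code{sigmaA} coefficient parameters:
\begin{xml}
<scene version=$\MtsVer$>
    <shape type="sphere">
        <!-- Index-matched boundary: no BSDF, only an interior medium -->
        <medium name="interior" type="homogeneous">
            <spectrum name="sigmaS" value="1"/>
            <spectrum name="sigmaA" value="0.05"/>
        </medium>
    </shape>
</scene>
\end{xml}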
@@ -66,29 +66,29 @@ It is also possible to create \emph{index-mismatched} boundaries between media,
the light is affected by the boundary transition:
\begin{xml}
<scene version=$\MtsVer$>
    <shape type="... shape type ...">
        ... $\code{shape}$ parameters ...
        <bsdf type="... bsdf type ...">
            ... $\code{bsdf}$ parameters ...
        </bsdf>
        <medium name="interior" type="... medium type ...">
            ... $\code{medium}$ parameters ...
        </medium>
        <medium name="exterior" type="... medium type ...">
            ... $\code{medium}$ parameters ...
        </medium>

        <!-- Alternatively: reference named media and BSDF
             instances that have been declared previously

             <ref id="myBSDF"/>
             <ref name="interior" id="myMedium1"/>
             <ref name="exterior" id="myMedium2"/>
        -->
    </shape>
</scene>
\end{xml}
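A concrete index-mismatched variant might pair a refractive boundary with a
scattering interior, e.g.\ to model a murky liquid. This is only a sketch: the
\code{dielectric} BSDF with its \code{intIOR} parameter, as well as the
\code{sphere} and \code{homogeneous} plugins and their parameters, are assumed
here rather than taken from the surrounding text:
\begin{xml}
<scene version=$\MtsVer$>
    <shape type="sphere">
        <!-- Index-mismatched boundary: light refracts when
             entering or leaving the interior medium -->
        <bsdf type="dielectric">
            <float name="intIOR" value="1.33"/>
        </bsdf>
        <medium name="interior" type="homogeneous">
            <spectrum name="sigmaS" value="1"/>
            <spectrum name="sigmaA" value="0.05"/>
        </medium>
    </shape>
</scene>
\end{xml}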
These are the standard ways in which a shape can be declared.