Fabrice Neyret's List of
Big Questions to be solved
(within his research themes)



(June 6, 2008)


Here are some "big questions" of our domain that puzzle me. This by no means implies that I am going to work on all of them, now or in the future. Nor that I will definitively solve those I do work on (these are *complex* questions). Nor that my improbable solutions would be the only possible ones. So feel authorized, and even invited, to reappropriate these questions, work on them, and propose your solutions, or at least your own steps! Everybody is welcome to help push the knowledge frontier one step further.

The order and size of the topics are arbitrary, depending on inspiration. My themes lie at the crossroads of physics, the appearance of the real world, and perception. My topics span fluid animation and look (water, clouds, smoke...), the appearance of details (2D or 3D, still or animated), and the design of minimal representations (linking knowledge and appearance through scales, including control and transitions). Many issues are also transversal (filtering, sampling, interpolation...).
Everything is intertwined, but if you really want direct access, a rough index could be:
- texture and blending (including animation),
- filtering and representations,
- physics,
- plus some extras: classical bad ideas and more refs.


Textures

What is a texture? How can we classify texture categories?
  • Some resynthesis algorithms totally succeed or totally fail depending on the kind of image. Could we characterize which properties are present or missing?
  • Procedural methods generate a rich space of textures, but not the whole space. Could we characterize it, at least for the classical combinations of algorithms and filters (Perlin, Worley, abs, displacement, lookup...)?
  • Some textures are isotropic and homogeneous at large scale and can thus map seamlessly onto non-flat surfaces. Some have one main orientation that can flow along the surface. Some have a loose, short-range 2D organization that can still accommodate curvature. But some (like checkers) implicitly embed a 2D parameterization and thus cannot map seamlessly. Can we characterize this "well-mappable" ability better than "isotropic / non-isotropic"?
  • For some real textures the meaningful data is the spectrum. For others, it is pixel neighborhoods (shapes). For yet others, it is multiscale histogram statistics. Can we characterize this better, i.e., which description is meaningful for which texture?
  • Why do we generate procedural textures using octaves? Guess: because the spectrum of the unit noise is broad, so the sum yields a continuous power law. What are the consequences for other spectra (e.g. stipple patterns)? What do lacunarity and the power-law exponent really represent in real textures? Hint: consider the property of the reconstructed signal, not of the octave amplitude factor. (See the sketch after this list.)
  • Getting new categories of look for procedural textures: 2D looks great, but 3D is not as easy. What about 3D trabecular patterns? 3D spikes? 3D lobes? And in 2D, what about paint-like or stipple-like patterns? Wood veins with inclusions?
  • How to extend Gardner's pseudo-volume noise approach to non-convex, non-intersecting surfaces?
  • How to mimic cloud wisps? How to link them with high-level physics?
  • The same for the appearance of cloud fields, lava, rocks and cliffs, canopies, cloth folds, crumpled paper...
  • Prefiltering procedural textures: can we do better than culling the octaves, and how far can we go? Hint: do not forget to account for the transforms, not only the base noise. (The sketch below only shows the octave-culling baseline.)
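
    As a reference point for the two items above, here is a minimal Python sketch (assuming a simple 1D value noise as the base primitive; the names value_noise_1d, fbm and max_freq are illustrative, not from any particular library) of what octaves, lacunarity and gain mean in practice, and of the naive octave-culling prefilter that the last question asks to go beyond:

      import numpy as np

      def value_noise_1d(x, seed=0):
          # Cheap band-limited 1D noise: random values on integers, smoothly
          # interpolated in between (a stand-in for Perlin noise).
          table = np.random.RandomState(seed).uniform(-1.0, 1.0, 256)
          i = np.floor(x).astype(int)
          f = x - i
          t = f * f * (3.0 - 2.0 * f)                 # smoothstep fade
          return table[i % 256] + t * (table[(i + 1) % 256] - table[i % 256])

      def fbm(x, octaves=6, lacunarity=2.0, gain=0.5, max_freq=None):
          # Sum of octaves: frequencies grow by 'lacunarity', amplitudes decay
          # by 'gain', which gives the reconstructed signal its approximate
          # power-law spectrum. 'max_freq' is the crude prefilter: octaves whose
          # base frequency exceeds it (e.g. the local Nyquist limit of the
          # screen footprint) are simply culled.
          total, amp, freq = np.zeros_like(x), 1.0, 1.0
          for o in range(octaves):
              if max_freq is not None and freq > max_freq:
                  break
              total += amp * value_noise_1d(freq * x, seed=o)
              amp *= gain
              freq *= lacunarity
          return total

      x = np.linspace(0.0, 10.0, 2048)
      full = fbm(x)                      # all octaves
      filtered = fbm(x, max_freq=4.0)    # octaves beyond the footprint culled

    Note that this baseline ignores the transforms applied on top of the noise (abs, sigmoids, lookups), which is exactly where the last question above starts.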

    Animated textures, textures of motion

  • How to animate, morph, and resample procedural textures during advection?
  • Flow noise suggests rotation as bump mapping suggests relief. How to exploit this for control, for paradoxical interpolation (e.g., a heterogeneous rotation field), or for conforming to a vorticity spectrum (e.g. Kolmogorov)? (See the sketch after this list.)
  • How to mimic the billowing shapes of clouds and smoke? How to link them with high-level physics? Guess: modes in vortex filaments.
  • How to mimic a turbulent volume pattern and its properties? Hint: high-level physics of a soup of vortex filaments. How to mimic instability surface patterns (as a texture, or as a source of macroscopic features)?
  • River surfaces. Lava flows. Foam textures (rivers, wave crests, swirl patterns after breaking or in a wake). Wind patterns (cat's paws on water and grass, wind-blown sand threads, canopy shaking patterns).
  • What smart things can be done knowing the various instability patterns (Kelvin-Helmholtz, Rayleigh-Taylor, von Karman, Kolmogorov cascade, roll waves, dispersion laws, thermodynamics, standard atmosphere...)?
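
    To make the flow-noise item above concrete, here is a minimal sketch (a simplified toy version in Python, not the original flow noise implementation): a 2D gradient noise whose lattice gradients rotate over time, with finer octaves spinning faster as a crude stand-in for a vorticity spectrum.

      import numpy as np

      def rotating_gradient_noise(x, y, t, spin=1.0, seed=0):
          # Perlin-style 2D gradient noise; the only change is that each
          # lattice gradient angle is advanced by spin * t over time.
          angles = np.random.RandomState(seed).uniform(0.0, 2.0 * np.pi, (256, 256))
          xi, yi = np.floor(x).astype(int), np.floor(y).astype(int)
          xf, yf = x - xi, y - yi
          u = xf * xf * (3.0 - 2.0 * xf)               # fade curves
          v = yf * yf * (3.0 - 2.0 * yf)

          def corner(ix, iy, dx, dy):
              a = angles[ix % 256, iy % 256] + spin * t    # rotating gradient
              return np.cos(a) * dx + np.sin(a) * dy       # dot(gradient, offset)

          n00 = corner(xi,     yi,     xf,       yf)
          n10 = corner(xi + 1, yi,     xf - 1.0, yf)
          n01 = corner(xi,     yi + 1, xf,       yf - 1.0)
          n11 = corner(xi + 1, yi + 1, xf - 1.0, yf - 1.0)
          nx0 = n00 + u * (n10 - n00)
          nx1 = n01 + u * (n11 - n01)
          return nx0 + v * (nx1 - nx0)

      def flow_noise(x, y, t, octaves=4):
          # Sum of octaves; here finer octaves simply spin proportionally
          # faster. Conforming to a real vorticity spectrum (Kolmogorov-like)
          # would mean choosing these spin rates and amplitudes carefully,
          # which is part of the open question.
          total, amp, freq = np.zeros_like(x), 1.0, 1.0
          for o in range(octaves):
              total += amp * rotating_gradient_noise(freq * x, freq * y, t,
                                                     spin=freq, seed=o)
              amp *= 0.5
              freq *= 2.0
          return total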

    Blending patterns

    Linearly blending the resulting colors is often the poorest idea: it makes sense only for linear phenomena (e.g. lightmaps, wave heightfields). Otherwise it yields ghosting effects.
  • Hint: when you want to blend in time, you can try to morph (i.e. interpolate the explicit or implicit high-level parameters describing the image). When you want to blend in space, you can try the same (blending the parameters).
  • This assumes descriptors are available to analyze/regenerate the images (if there are no first-hand descriptors).
  • First-hand descriptors (e.g. for procedural textures) might not have the right properties for linear interpolation to give seamless results (some property gets lost). This requires obtaining or adapting the descriptors.
  • Interpolating texture descriptors induces various apparent motions: the intended main motion plus parasitic secondary motions. How to predict, control, and characterize the secondary motions? The same question holds for any continuous change, in space or time, of a procedural texture parameter. (See the sketch below.)
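
    A toy illustration of the difference (a sine grating stands in for a texture fully described by two high-level parameters, frequency and orientation; all names here are illustrative):

      import numpy as np

      def grating(x, y, freq, angle):
          # Toy 'procedural texture' fully described by two descriptors.
          return np.sin(2.0 * np.pi * freq * (x * np.cos(angle) + y * np.sin(angle)))

      x, y = np.meshgrid(np.linspace(0, 1, 256), np.linspace(0, 1, 256))
      pA = (8.0, 0.0)              # (frequency, orientation) of texture A
      pB = (16.0, np.pi / 4.0)     # descriptors of texture B
      w = 0.5                      # blend weight

      # Linear blend of the resulting colors: both gratings stay visible (ghosting).
      ghosted = (1 - w) * grating(x, y, *pA) + w * grating(x, y, *pB)

      # Morph: interpolate the descriptors, then evaluate once; a single coherent
      # grating with in-between frequency and orientation.
      freq = (1 - w) * pA[0] + w * pB[0]
      angle = (1 - w) * pA[1] + w * pB[1]
      morphed = grating(x, y, freq, angle)

    As w varies continuously, the morphed grating appears to rotate and rescale: the parasitic secondary motion of the last item is already visible on the simplest possible descriptors.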

    Filtering color LookUp Tables and volume transfer functions

    These are very non-linear transforms, so linear filtering yields very bad results.
  • How to deal with filtering and MIP-mapping? What is the meaning of such a transform in terms of designing shapes and materials (cf. structures or grains carved out by applying a sigmoid to a smooth field)? This is tied to properly defining the LOD and the blending. (See the sketch after this list.)
  • How to filter reflection maps (envmaps)?
  • How to filter normal maps and height maps? LinkTo: next section.
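
    One classical partial answer, sketched here on a 1D field (an illustrative example; it treats the easy part only and ignores correlations and the non-linearity of transmittance, which is where the open questions start): apply the transfer function at the fine level and MIP-map the transformed values, rather than MIP-mapping the index and applying the transfer function afterwards.

      import numpy as np

      def lut_sigmoid(d, center=0.5, sharpness=40.0):
          # A sharply non-linear transfer function, e.g. carving grains out of
          # a smooth density field with a sigmoid.
          return 1.0 / (1.0 + np.exp(-sharpness * (d - center)))

      density = np.random.rand(512)            # stand-in for a fine scalar field

      # Naive: average (MIP) the index first, then apply the non-linear LUT.
      mip_then_lut = lut_sigmoid(density.reshape(-1, 8).mean(axis=1))

      # Closer to the right appearance: apply the LUT at the fine level, then
      # average, so each coarse texel stores the mean post-transfer value.
      lut_then_mip = lut_sigmoid(density).reshape(-1, 8).mean(axis=1)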

    Representations as scale-adapted proxies for appearance

  • High-level knowledge for patterns, animation, and scene rendering: what smart things can be done if we know the spectrum? The (auto)correlation? The space-angle accessibility/occlusion? The distributions of such properties?
  • How to progressively filter shapes (carrying a BRDF) into a BRDF, an NDF, or some other representation? (See the sketch after this list.)
  • The case of animation (ocean waves at the horizon). Combination with envmaps and illumination (ocean and mountains seen from a satellite view).
  • How to keep a memory of the geometry inside the BRDF (for further geometric manipulation, e.g., stretching)?
  • How to account for local occlusion (with or without a priori knowledge of the distribution: a volume of leaves vs. a complex terrain)?
  • What are the consequences of a density gradient or density texture (at the cloud border, or inside) on lighting (transmission, reflection)? LinkTo: validity of linearly averaging the density field.
  • The same question for dust in water and air: the role of uncolored secondary scatterers in the color and intensity of participating media.
  • Do we really see a full dense 3D field (e.g. dense tree branches in winter) with the 2D visual maps of our brain? Why does old parallax pseudo-perspective animation fool the eye so well? Can we use this relief-myopia for more efficient representations?
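
    For the normal-map case, one known partial answer, in the spirit of Toksvig's trick (sketched here as an assumption, not as the answer these questions call for), is to keep the length of the averaged normal as a measure of its spread and fold it into the width of the specular lobe, instead of losing it through renormalization:

      import numpy as np

      def filtered_normal_and_roughness(normals):
          # Average the unit normals of a footprint; the shorter the mean, the
          # more they spread. Map that spread to a variance-like roughness.
          m = normals.mean(axis=0)
          length = np.linalg.norm(m)
          filtered_n = m / max(length, 1e-6)              # representative normal
          roughness = (1.0 - length) / max(length, 1e-6)  # 0 if all normals agree
          return filtered_n, roughness

      # Toy footprint: normals jittered around +z.
      rng = np.random.RandomState(1)
      n = rng.normal([0.0, 0.0, 1.0], 0.15, (64, 3))
      n /= np.linalg.norm(n, axis=1, keepdims=True)
      print(filtered_normal_and_roughness(n))

    This keeps no memory of the geometry beyond a symmetric lobe, which is precisely what the BRDF/NDF, stretching and occlusion items above ask to go beyond.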

    Infinite zoom

    How to define it properly? Several possible tracks: scaling the scene or the observer, allowing a point observer (no near plane).
  • How to deal with the near and far planes? With z precision? How to adapt the velocity?
  • For volume materials we have to cheat, otherwise everything turns totally transparent: how to do this in a coherent way (including for MIP-mapping)?
  • Infinite zoom in procedural textures: how to fade or cache the large scales? How to filter correctly (or cheat correctly) at the small scales? How to detect empty/filled areas for efficient synthesis? (See the sketch after this list.)
  • Infinite zoom on surfaces (zooming in on a cliff). Necessary cheats (fading out large relief and shadowed regions).
  • How to light and shadow such huge and deep scenes?
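
    A minimal 1D sketch of the fading strategy for procedural textures (illustrative code; value_noise_1d is the same toy primitive as in the texture section): the window of contributing octaves slides with the zoom level, octaves much coarser than the view are skipped (they could be cached as an offset), and octaves approaching the pixel footprint are faded out as a crude prefilter.

      import numpy as np

      def value_noise_1d(x, seed=0):
          table = np.random.RandomState(seed).uniform(-1.0, 1.0, 256)
          i = np.floor(x).astype(int)
          f = x - i
          t = f * f * (3.0 - 2.0 * f)
          return table[i % 256] + t * (table[(i + 1) % 256] - table[i % 256])

      def zooming_fbm(x_view, zoom, lacunarity=2.0, gain=0.5):
          # x_view: pixel coordinates in the current view; world coordinate is
          # x_view / zoom. The octave range is chosen relative to the view, so
          # new detail keeps appearing at any zoom level, at constant cost.
          x_world = x_view / zoom
          view_size = (x_view[-1] - x_view[0]) / zoom
          pixel_size = (x_view[1] - x_view[0]) / zoom
          o_min = int(np.floor(np.log(1.0 / view_size) / np.log(lacunarity)))
          o_max = int(np.ceil(np.log(1.0 / pixel_size) / np.log(lacunarity)))
          total = np.zeros_like(x_view)
          for o in range(o_min, o_max + 1):
              freq = lacunarity ** o
              amp = gain ** (o - o_min)      # contrast kept relative to the view
              # fade octaves whose wavelength approaches the pixel footprint
              fade = np.clip(1.0 / (freq * pixel_size) - 1.0, 0.0, 1.0)
              total += fade * amp * value_noise_1d(freq * x_world, seed=o % 16)
          return total

      x = np.linspace(0.0, 511.0, 512)          # one scanline of pixels
      shallow = zooming_fbm(x, zoom=1.0)
      deep    = zooming_fbm(x, zoom=1.0e4)      # much deeper zoom, same cost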

    Physics and fluids

  • How to extrapolate 2D high-level physics to 3D (e.g. Bernoulli-based river height; see the sketch at the end of this list)?
  • How to extrapolate to the near field (e.g. waves near a boat or an obstacle)?
  • How to extrapolate to heterogeneous environments (e.g. waves upstream of an obstacle)?
  • How to extrapolate to combinations of scales (e.g. ripples on waves)?
  • How to get the amplitude spectrum (e.g. for boat or obstacle waves)?
    [LinkedTo: textured animation:]
  • Physics of vortex filaments: natural modes, couplings, modes in a 'texture' unit volume.
  • Physics of turbulence. Properties of a unit volume. Link with vortex filaments.
  • Physics of fluid instabilities and their combinations (e.g. billowing).
  • Physics of water surfaces as analytic waves for practical, realistic cases.
  • Physics of wind-surface interaction (cat's paws on water, wind-blown sand threads).
  • Shape and animation of convective clouds, aerosol avalanches, billowing smoke.
  • Texture of cloud wisps; links with vortex filaments, turbulence, thermodynamics.
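
    As an illustration of the kind of "Bernoulli-based river height" meant in the first item of this list, a toy 1D sketch (illustrative values; a steady channel that narrows, with the depth recovered from the energy head and mass conservation):

      import numpy as np

      G = 9.81                                  # gravity (m/s^2)

      def surface_height(v, energy_head):
          # Steady free-surface Bernoulli relation along the stream,
          # h + v^2 / (2 g) = E, with the pressure term dropped at the surface.
          return energy_head - v ** 2 / (2.0 * G)

      width = np.array([4.0, 3.0, 2.0])         # channel width at three sections (m)
      q = 2.0                                   # discharge (m^3/s)
      h = np.full(3, 1.0)                       # depth, initialized from upstream (m)
      E = 1.0 + (q / (width[0] * 1.0)) ** 2 / (2.0 * G)   # head set by the upstream section

      for _ in range(50):                       # fixed point: v depends on h via q = width*h*v
          v = q / (width * h)
          h = surface_height(v, E)
      print(h)    # the depth drops where the channel narrows (subcritical branch)

    Even this crude 1D balance already gives a plausible longitudinal surface profile; the questions above are about extending such high-level relations to 3D, near fields, obstacles, and combined scales.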

    Techniques and tools I should learn about (whenever available)

  • Scale-dependent notions of normal and curvature.
  • Smart interpolation of non-scalar data (vectors/directions, rotations, tensors)... (see the sketch after this list).
  • Sampling and reconstruction of heterogeneous anisotropic fields, including the animated case.
  • Reconstruction (interpolation) on unstructured meshes, including polygonal cells with n > 4 vertices.
  • Delaunay triangulation on curved surfaces.
  • Perceptual metrics for static and animated patterns, and for detecting visible flaws or invisible waste.
  • More physics. Physical intuition. More instability modes.
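
    For the non-scalar interpolation item, a standard example (a minimal sketch for unit vectors/directions; the same formula applies to unit quaternions for rotations) is spherical linear interpolation, which interpolates the angle rather than the coordinates:

      import numpy as np

      def slerp(a, b, t):
          # Spherical linear interpolation: the result stays on the unit sphere
          # and moves at constant angular speed, unlike lerp + renormalize.
          a = a / np.linalg.norm(a)
          b = b / np.linalg.norm(b)
          d = np.clip(np.dot(a, b), -1.0, 1.0)
          theta = np.arccos(d)
          if theta < 1e-6:                      # nearly parallel: lerp is fine
              return (1 - t) * a + t * b
          return (np.sin((1 - t) * theta) * a + np.sin(t * theta) * b) / np.sin(theta)

      n0 = np.array([0.0, 0.0, 1.0])
      n1 = np.array([1.0, 0.0, 0.0])
      mid = slerp(n0, n1, 0.5)                  # [0.707, 0, 0.707], still unit length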

    Classical bad ideas

    (Often bad ideas, though not always.)
  • Designing a model, representation or algorithm without scope, goal, explicit hypothesis, or quality metrics.
  • Invoking "ground truth" when simulating a physical phenomenon.
  • Achieving continuity by blending or morphing whatever parameter comes to hand, without a quality criterion.
  • "Plenty of parameters" instead of "dense, interesting and controllable generated space".
  • Preserving some invariant (e.g. volume) by a simple global operation (e.g. renormalization). Cause: the real constraint might be something finer (e.g. locally divergence-free).
  • Solving contradictory requirements on a set (e.g. an image) by designing an energy function summed over all pixels and minimizing it. Cause: this tends to concentrate the problems on zero-width features such as points or curves. But these can be very visible, or can break implicit constraints such as continuity and homogeneity *everywhere* ('almost everywhere' allows jumps).
  • Linear interpolation of non-scalar data or of non-linear transforms.
  • Manipulating co-vectors like vectors (e.g., applying geometric transforms to normals or bump maps; see the sketch below).
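
    A minimal sketch of the co-vector item (normals under a non-uniform scale; transform_normal is an illustrative helper, not part of any particular API):

      import numpy as np

      def transform_normal(normal, model_matrix):
          # Normals are co-vectors: if positions are transformed by M, normals
          # must be transformed by the inverse transpose of M (then renormalized),
          # otherwise they stop being perpendicular under non-uniform scale or shear.
          n = np.linalg.inv(model_matrix).T @ normal
          return n / np.linalg.norm(n)

      M = np.diag([2.0, 1.0, 1.0])                     # non-uniform scale along x
      n = np.array([1.0, 1.0, 0.0]) / np.sqrt(2.0)     # normal of a 45-degree slope
      wrong = M @ n / np.linalg.norm(M @ n)            # treated as a vector: no longer perpendicular
      right = transform_normal(n, M)                   # inverse transpose: stays perpendicular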

    More lists of thoughts

  • My AFIG'06 invited paper (in French).
  • The introduction (chapters 1&2) of my Habilitation dissertation (in French).
  • A collection of "unsolved problems" lists made by famous CG people.
  • My gallery of procedural textures.