A work in progress game engine.
Status of building all examples:
See the executables from `ghengin-games.cabal` in the examples/ directory.
The working examples are the following. In sequence, they build up to a
"user-space" higher-level engine using only the Core of the engine (ghengin-core).
Solid examples:
- `simple-triangle`: a simple 2D triangle
- `simple-triangle-colored`
- `simple-cube`: a simple 3D cube (no projections)
- `simple-camera`: example of a camera with projection
- `full-pipeline`: a full shader pipeline example ported from FIR
- `function-drawing`: attempt at drawing functions using shaders
Experimental examples:
- `function-plotting`: attempt at drawing functions using vertices
- `oscilloscope`: attempt at drawing with an oscilloscope (UNFINISHED)
- `lorenz-attractor`: the Lorenz attractor; currently only a static shader image
Implementing write-ups:
- `book-of-shaders-3`: Chapter 3 of Book of Shaders
- `book-of-shaders-5`: Chapter 5 of Book of Shaders
We have a nix derivation to set up a shell with all required Vulkan and GHC dependencies. You can enter it using:

```bash
nix-shell
```
But it should also be possible to run the project if you provide all required
dependencies yourself. You can take a look at `shell.nix` to get an idea of what
they are.
After that, we use cabal to build and run the examples. For instance:

```bash
cabal run exe:simple-triangle
# or
cabal run exe:simple-camera
# or
cabal repl exe:simple-cube
```
- Shader first -- the engine is designed with custom shaders at the center, and a lot of compile-time validation and runtime data is based on the shader
- Compile-time validation of compatibility between the game-defined materials, meshes and the game-defined shader programs.
- The core of the engine is all in the linear IO monad (Haskell + linear types).
- Many more...
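The linear-IO point can be illustrated with a tiny standalone sketch using GHC's `LinearTypes` extension (GHC 9.0+). This is not ghengin-core's actual API; `Buffer`, `withBuffer` and `free` are hypothetical names chosen only to show why linearity matters for a renderer: a GPU-resource handle must be consumed exactly once, so leaking it or double-freeing it is a compile-time error rather than a runtime bug:

```haskell
{-# LANGUAGE LinearTypes #-}
module Main where

-- Stand-in for a GPU resource (e.g. a vertex buffer). Hypothetical
-- type: ghengin-core's real renderer API differs.
newtype Buffer = Buffer Int

-- The continuation receives the handle linearly: it must consume
-- the Buffer exactly once.
withBuffer :: Int -> (Buffer %1 -> IO ()) -> IO ()
withBuffer n k = k (Buffer n)

-- Freeing consumes the handle, so it cannot be used afterwards.
free :: Buffer %1 -> IO ()
free (Buffer n) = putStrLn ("freed buffer of size " ++ show n)

main :: IO ()
main = withBuffer 1024 (\buf ->
  -- Omitting this 'free buf', or calling it twice, is rejected by
  -- the type checker, because 'buf' has multiplicity 1.
  free buf)
```

Writing the whole Core in a linear IO monad extends this discipline from one buffer to every renderer resource (swapchains, descriptor sets, pipelines, ...).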
I haven't had much time to write about this, but the key ideas are:
- Things inserted into a `RenderQueue` are the things that get rendered. Each render packet is essentially defined by:
  - a `Mesh`
  - a `Material`
  - and a `RenderPipeline`
- Meshes are vertices together with properties that influence how those vertices are rendered. A mesh is parametrized by:
  - A type list describing the properties of each vertex in this mesh
  - (This is not yet implemented:) A type list describing the property bindings that describe this mesh and get bound to descriptor set #2 for each different mesh that is drawn
  - Note that multiple render packets sharing the same mesh can be drawn while the mesh properties are still bound only once.
- Materials are groups of properties that influence how all render packets sharing the material are rendered. A material is parametrized by:
  - A type list with the type of each property describing this material, which will get bound once to descriptor set #1 for every different material
  - Note that multiple render packets with different meshes may share the same material, and the material properties will be shared across mesh draws without being rewritten
  - (Each material may get bound more than once if there's no clear serialization of draw calls that ensures the material only needs to be bound once -- this has to do with heuristics in the render queue; I don't recall all the details)
- Render pipelines are groups of properties, together with a description of a render pipeline in the graphics-programming sense, that define how all render packets sharing the render pipeline are rendered (across different materials and meshes). A render pipeline is parametrized by:
  - A type list describing the properties shared across all render packets drawn with this render pipeline, which will be bound in descriptor set #0
  - A complete type-level description of the shader, which is the type of the shader program in the FIR shader language
- The `Compatible` constraint must be satisfied in order to construct a render packet. This constraint validates, at compile time, that:
  - For the `Mesh` (see also `CompatibleMesh`):
    - The properties of each vertex match the vertex properties expected by the shader
    - (This is not yet implemented:) The mesh properties match the properties expected to be bound at descriptor set #2 in the shader
  - For the `Material` (see also `CompatibleMaterial`):
    - The properties of the material match the properties expected to be bound at descriptor set #1 by the shader
  - For the `RenderPipeline` (see also `CompatiblePipeline`):
    - The properties of the pipeline match the properties expected to be bound at descriptor set #0 by the shader
- ...
- The Core of the engine is abstracted over the renderer implementation (through Backpack), though we only have a Vulkan implementation of the renderer, and the Core isn't yet fully standalone
- The Core of the engine is much like Core in GHC: it strives to be a tiny but very expressive engine in which the full engine can be represented (the full engine provides additional features not directly available in Core, but they can be expressed in it). For example:
  - The `Camera` construct is not part of Core, for it can be fully defined as a `RenderPipeline` property that gets bound in descriptor set #0 once per render pipeline, plus some shader math. Of course, this ought to be provided as a plug-and-play capability in the full engine (say, one just has to import the Camera module, add it as a property of the render pipeline, and call the imported camera shader function in their own shader)
  - Since the shaders are written in Haskell, it's pretty convenient that one can easily use other engine-defined shader functions
  - ...
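The idea behind the `Compatible` constraint can be illustrated with a small standalone sketch. This is not ghengin-core's actual implementation: `Mesh`, `Shader`, `Compatible` and `renderPacket` below are hypothetical stand-ins, and the real constraint matches shader-derived metadata structurally rather than by bare type equality. The sketch only shows the shape of the technique, i.e. indexing values by type-level property lists and rejecting mismatched combinations at compile time:

```haskell
{-# LANGUAGE DataKinds, ConstraintKinds, KindSignatures, TypeFamilies #-}
module Main where

import Data.Kind (Type)

-- Toy stand-ins (hypothetical names, not ghengin-core's API): a mesh
-- and a shader are each indexed by a type-level list of vertex
-- property types.
data Mesh   (props :: [Type]) = Mesh
data Shader (props :: [Type]) = Shader

-- A 'Compatible'-style constraint: here, the mesh's vertex properties
-- must equal the ones the shader expects. Plain type equality is
-- enough to demonstrate the idea.
type Compatible (mesh :: [Type]) (shader :: [Type]) = mesh ~ shader

-- A render packet can only be formed from compatible pieces, so a
-- mismatch is reported at compile time, not at draw time.
renderPacket :: Compatible ms ss => Mesh ms -> Shader ss -> String
renderPacket Mesh Shader = "packet ok"

main :: IO ()
main = do
  let mesh   = Mesh   :: Mesh   '[Float, Float]  -- e.g. position + colour
      shader = Shader :: Shader '[Float, Float]
  putStrLn (renderPacket mesh shader)
  -- 'renderPacket mesh (Shader :: Shader '[Float])' would not typecheck.
```

The payoff is the same as described above: forgetting a vertex attribute, or binding a property list the shader does not expect, becomes a type error instead of a silent rendering bug.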
Some resources:
- https://developer.nvidia.com/blog/vulkan-dos-donts/
- https://zeux.io/2020/02/27/writing-an-efficient-vulkan-renderer/
- https://www.intel.com/content/www/us/en/developer/articles/training/api-without-secrets-introduction-to-vulkan-part-6.html
- https://arm-software.github.io/vulkan_best_practice_for_mobile_developers/samples/performance/descriptor_management/descriptor_management_tutorial.html
- Creating the Art of ABZU
What ghengin-core does and does not:
- Does not implement a game-loop, nor a time step strategy
- Does not implement a scene-graph
- Nor (game) object positioning in terms of so-called "world coordinates"
- Does not provide a camera
- Does not manage game objects/entities in any way (no ECS, no FRP; actually, no concept of a game object whatsoever)
- Does not have a UI abstraction, but will allow render passes to be managed in such a way that one can add, outside of Core, e.g. a dear-imgui render pass
- Has an (editable) Render Queue with the render packets (meshes + properties + material properties + pipeline with shader) that are rendered every frame
- Can express all of the above things it "does not" do with the existing concepts and combinators
- Handles double-buffering (eventually configurable?) semantics for the 'render' function, i.e. blocks after drawing a second frame
  - Actually, it's the renderer implementation that handles this
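Since Core provides no game loop, one lives in user space on top of Core's render entry point. The sketch below is a plain-IO stand-in (ghengin-core's real `render` runs in its linear renderer monad, and `RenderQueue` is not a list of strings); the hypothetical names only illustrate where the loop, the time step, and queue edits would live relative to Core:

```haskell
module Main where

-- Hypothetical stand-ins for Core's concepts: the real render queue
-- holds render packets, not strings, and rendering is linear.
type RenderQueue = [String]

-- Stand-in for Core's render entry point: draw whatever is queued.
render :: RenderQueue -> IO ()
render queue = putStrLn ("frame: " ++ show queue)

-- The game loop, time step, and entity updates are all user-space
-- code: each iteration may edit the queue, then hands it to 'render'.
gameLoop :: Int -> RenderQueue -> IO ()
gameLoop 0 _     = pure ()
gameLoop n queue = do
  let queue' = queue        -- update game state / edit the queue here
  render queue'
  gameLoop (n - 1) queue'

main :: IO ()
main = gameLoop 3 ["triangle-packet"]
```

This mirrors the division of labour above: Core owns the queue and the per-frame draw (including double-buffering), while stepping strategy and game-object management stay outside it.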
These add-ons exist as separate packages, and are all included in ghengin, the
batteries-included engine. These also attempt to be somewhat independent of
ghengin-core when possible.
Work in progress ones:
- `ghengin-camera`, a camera object, shader, and update function (i.e. a camera, in its usual meaning)
- `ghengin-geometry`, some geometry things
Unimplemented ideas:
- `ghengin-scene-graph`, which defines a scene graph and world coordinate space, with objects related in a hierarchy and properties defined relative to their parents (i.e. a scene, in its usual meaning)
- `ghengin-models`, to load and render 3D models
- `ghengin-lighting`, which provides lighting functions/models like the Blinn-Phong model
- `ghengin-dearimgui`, for UIs based on ghengin
ghengin will provide game-development abstractions on top of ghengin-core, and
will be more developer friendly in the sense that it does not require linear types
everywhere, only where expressly necessary.
It is future work.