This page summarizes the graphics design plan of the OpenJDK Mac port.
Requirements
0. Support generic Java2D drawing in ordinary heavyweight NSWindows
1. Support embedding applet content in NPAPI browsers across processes
2. Support embedding other Quartz or OpenGL based drawing via JAWT
- including sub-embedding via JAWT in applet content
3. Be highly performant
- including sub-embedding via JAWT in applet content
Remaining work
- Connect the Java2D OpenGL rendering pipe to an IOSurface texture for each content view
- Wire the CAOpenGLLayer to copy the entire IOSurface texture (with appropriate locking on the IOSurface)
- Implement the JAWT API to connect provided CALayers as sub-layers of the top-level window's CALayer
- Complete implementations of the "lightweight" AWT controls
- Determine which surface primitive to expose for out-of-process rendering in a web browser plug-in: CALayers or IOSurfaces
- Connect the NPAPI plug-in to the chosen mechanism
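The "lightweight" controls mentioned above are AWT components with no native NSView peer: all of their pixels are produced by Java2D and composited by AWT itself. A minimal sketch of the idea (the class name and rendering here are illustrative, not part of the port's actual control implementations):

```java
import java.awt.Color;
import java.awt.Component;
import java.awt.Graphics;
import java.awt.image.BufferedImage;

// Sketch of a "lightweight" AWT control: it has no native peer, so every
// pixel comes from Java2D in paint(), and AWT composites it into its parent.
class LightweightButton extends Component {
    @Override
    public void paint(Graphics g) {
        g.setColor(Color.BLUE);
        g.fillRect(0, 0, getWidth(), getHeight());
    }
}

public class LightweightDemo {
    public static void main(String[] args) {
        LightweightButton b = new LightweightButton();
        b.setSize(32, 32);
        // Paint into an offscreen image, much as AWT does when compositing
        // lightweight children into a heavyweight surface.
        BufferedImage img = new BufferedImage(32, 32, BufferedImage.TYPE_INT_ARGB);
        Graphics g = img.createGraphics();
        b.paint(g);
        g.dispose();
        System.out.println(Integer.toHexString(img.getRGB(16, 16))); // ff0000ff
    }
}
```

Because such controls draw purely through Java2D, they flow through the same OpenGL pipe and IOSurface texture as all other window content.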
Completed work
- Render all Java2D drawing in OpenGL
- Bring up Cocoa-based event system
- Prototyped CAOpenGLLayer drawing directly from Java2D OpenGL rendering pipe
- Requires further API to be exposed
- Fails to draw the entire scene; needs an intermediate buffer
- Prototyped IOSurface drawing into an NPAPI plug-in
- Does not connect to Java2D OpenGL rendering pipe
- Cannot support sub-embedding (JAWT)
- Determined that both an IOSurface and a CALayer need to be used together to form a complete working system
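For reference, the OpenGL-based Java2D pipe in Sun's JDK is requested with the sun.java2d.opengl system property, and application drawing code is pipeline-agnostic. A minimal, hedged sketch (the offscreen BufferedImage shown here is software-rasterized; on-screen surfaces are what the OpenGL pipe actually accelerates):

```java
import java.awt.Color;
import java.awt.Graphics2D;
import java.awt.RenderingHints;
import java.awt.image.BufferedImage;

public class PipelineSketch {
    public static void main(String[] args) {
        // Request the OpenGL pipeline for on-screen surfaces. The flag name
        // comes from Sun's JDK and must be set before AWT initializes.
        System.setProperty("sun.java2d.opengl", "true");

        // The same Java2D code runs unchanged regardless of which pipeline
        // (Quartz, Sun2D, or OpenGL) ultimately rasterizes it.
        BufferedImage img = new BufferedImage(64, 64, BufferedImage.TYPE_INT_RGB);
        Graphics2D g2 = img.createGraphics();
        g2.setRenderingHint(RenderingHints.KEY_ANTIALIASING,
                            RenderingHints.VALUE_ANTIALIAS_ON);
        g2.setColor(Color.RED);
        g2.fillRect(0, 0, 64, 64);
        g2.dispose();
        System.out.println(Integer.toHexString(img.getRGB(32, 32))); // ffff0000
    }
}
```

This pipeline-independence is what allows the port to retarget all Java2D drawing at an OpenGL/IOSurface backend without changes to application code.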
Q&A
Q: Why use CoreAnimation layers?
A: CALayers can be connected into a hierarchical tree of arbitrary rendering surfaces across a variety of rendering technologies (not just OpenGL), which makes them ideal for embedding content from disparate technologies (SWT, QuickTime, JOGL, LWJGL, etc.). These trees can have branches that span processes, but doing so requires new API to be introduced to JavaRuntimeSupport.framework. CALayers can only be drawn to on the main (AppKit) thread.
Q: Why use IOSurfaces?
A: IOSurfaces can wrap OpenGL texture resources and share them between processes and the graphics card. An IOSurface is a good container for the pixels of the Java2D OpenGL pipeline, because it has no unusual threading requirements and its texture can hold the entire Java2D scene (unlike a CALayer, which retains no state). However, IOSurfaces have no natural affordance for embedding, layering, or chaining sub-surfaces, so they are not an appropriate substrate for embedding.
Q: Why not use CoreGraphics drawing in applet plug-ins?
A: CoreGraphics draws to in-process memory, and has no natural affordance for either cross-process drawing or embedded drawing. Using CoreGraphics (as opposed to CoreAnimation) drawing in an NPAPI plug-in is unlikely to yield a simple or performant solution.
Q: Why use OpenGL? What happened to the Quartz and Sun2D renderers in Apple's Java SE 6?
A: The Quartz and Sun2D rendering pipelines in Java SE 6 are strictly in-memory drawing routines that target the window back-buffer, which is memory shared with the WindowServer. Since WindowServer windows are not easily shared between processes (impossible using only API) and are in-memory-only structures, they do not form an ideal substrate on which to build a performant, cross-process, embeddable graphics system.
Q: Can Quartz and CALayers be used to make a functional graphics system?
A: Yes, but CALayers require all drawing to occur on the main (AppKit) thread, and the drawing would still go to a malloc()'d in-memory array of pixels. Issuing OpenGL commands directly against an IOSurface texture gives us a way to drive drawing straight to the graphics card, and then flip the entire scene into a (potentially shared) CALayer on the main thread.