| This feature was delivered in JDK 7u4, which was released in April 2012; the project was subsequently dissolved in February 2020. Discussion about ports may be found on porters-dev. |
1. Support generic Java2D drawing in ordinary heavyweight NSWindows
2. Support embedding applet content in NPAPI browsers across processes
3. Support embedding other Quartz or OpenGL based drawing via JAWT (see the JAWT sketch after this list)
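For the JAWT case (item 3), native code reaches the AWT drawing surface through the standard JAWT locking protocol. The following C sketch is illustrative only: the function name is hypothetical, and the handling of dsi->platformInfo is left abstract, since a CALayer-based pipeline would be expected to vend a layer for embedded Quartz/OpenGL content rather than a raw NSView or NSWindow handle.

```c
#include <jawt.h>
#include <jawt_md.h>

/* Hypothetical entry point: called from JNI with a reference to a
   java.awt.Canvas (or other heavyweight) to draw native content into it. */
void renderIntoCanvas(JNIEnv *env, jobject canvas) {
    JAWT awt;
    awt.version = JAWT_VERSION_1_4;
    if (JAWT_GetAWT(env, &awt) == JNI_FALSE) {
        return;                               /* AWT not available */
    }

    JAWT_DrawingSurface *ds = awt.GetDrawingSurface(env, canvas);
    if (ds == NULL) {
        return;
    }

    jint lock = ds->Lock(ds);
    if ((lock & JAWT_LOCK_ERROR) == 0) {
        JAWT_DrawingSurfaceInfo *dsi = ds->GetDrawingSurfaceInfo(ds);
        if (dsi != NULL) {
            /* dsi->platformInfo is platform specific; a CALayer-based
               pipeline would expose a layer here for embedded Quartz or
               OpenGL drawing, instead of a window/view handle. */
            ds->FreeDrawingSurfaceInfo(dsi);
        }
        ds->Unlock(ds);
    }
    awt.FreeDrawingSurface(ds);
}
```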
...
...
A: CoreGraphics draws to in-process memory. CoreGraphics has no natural affordance for either cross-process drawing or embedded drawing. Using CoreGraphics (as opposed to CoreAnimation) drawing in an NPAPI plug-in is unlikely to yield a simple or performant solution.
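To make the in-process-memory point concrete, here is a minimal CoreGraphics sketch (buffer size, pixel format, and the helper name are illustrative): the bitmap context renders into a plain buffer that only the calling process can see.

```c
#include <CoreGraphics/CoreGraphics.h>
#include <stdlib.h>

/* Draws into a calloc()'d buffer owned by this process; the caller frees it. */
static void *drawWithCoreGraphics(size_t width, size_t height) {
    size_t bytesPerRow = width * 4;
    void *pixels = calloc(height, bytesPerRow);          /* in-process memory */
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    CGContextRef ctx = CGBitmapContextCreate(pixels, width, height, 8, bytesPerRow,
                                             rgb,
                                             (CGBitmapInfo)kCGImageAlphaPremultipliedFirst);
    CGContextSetRGBFillColor(ctx, 1.0, 0.5, 0.0, 1.0);
    CGContextFillRect(ctx, CGRectMake(0, 0, width, height));
    CGContextRelease(ctx);
    CGColorSpaceRelease(rgb);
    /* Getting these pixels to another process (e.g. the browser hosting an
       NPAPI plug-in) requires an explicit copy; nothing here is shareable or
       GPU-resident by construction. */
    return pixels;
}
```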
A: Yes, but CALayers require all drawing to occur on the main (AppKit) thread, and the drawing would still be going to a malloc()'d in-memory array of pixels. OpenGL commands issued directly against an IOSurface-backed texture give us a way to drive drawing directly to the card, and then flip the entire scene into a (potentially shared) CALayer on the main thread.
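A rough sketch of the IOSurface-backed texture setup follows. The surface properties, texture target, and helper names are assumptions for illustration, not the port's actual code; the point is that GL rendering lands directly in IOSurface memory that can also be referenced from another process or composited into a CALayer.

```c
#include <IOSurface/IOSurface.h>
#include <OpenGL/OpenGL.h>
#include <OpenGL/gl.h>
#include <OpenGL/CGLIOSurface.h>
#include <CoreFoundation/CoreFoundation.h>

/* Create a 32-bit-per-pixel IOSurface of the requested size. */
static IOSurfaceRef createSurface(int width, int height) {
    int bpe = 4;                                   /* bytes per pixel */
    CFNumberRef w = CFNumberCreate(NULL, kCFNumberIntType, &width);
    CFNumberRef h = CFNumberCreate(NULL, kCFNumberIntType, &height);
    CFNumberRef b = CFNumberCreate(NULL, kCFNumberIntType, &bpe);
    const void *keys[]   = { kIOSurfaceWidth, kIOSurfaceHeight, kIOSurfaceBytesPerElement };
    const void *values[] = { w, h, b };
    CFDictionaryRef props = CFDictionaryCreate(NULL, keys, values, 3,
                                               &kCFTypeDictionaryKeyCallBacks,
                                               &kCFTypeDictionaryValueCallBacks);
    IOSurfaceRef surface = IOSurfaceCreate(props);
    CFRelease(props); CFRelease(w); CFRelease(h); CFRelease(b);
    return surface;
}

/* Bind the IOSurface as the backing store of a rectangle texture, so that
   subsequent GL rendering into that texture goes straight to the surface. */
static GLuint bindSurfaceTexture(CGLContextObj cglContext, IOSurfaceRef surface,
                                 int width, int height) {
    GLuint tex = 0;
    CGLSetCurrentContext(cglContext);
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, tex);
    /* No copy: the texture aliases the IOSurface's pixels. */
    CGLTexImageIOSurface2D(cglContext, GL_TEXTURE_RECTANGLE_ARB, GL_RGBA,
                           width, height, GL_BGRA, GL_UNSIGNED_INT_8_8_8_8_REV,
                           surface, 0);
    glBindTexture(GL_TEXTURE_RECTANGLE_ARB, 0);
    return tex;
}
```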
A: Every window has an NSOpenGLContext, which is targeted by the Java2D OpenGL renderer, as well as an off-screen "scratch" context. This NSOpenGLContext is assigned directly to the root AWTView of the NSWindow, which essentially connects the window's back-buffer pixels to the OpenGL context.
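The NSOpenGLContext attached to each window wraps a lower-level CGL context. The sketch below shows, at the CGL level, how such a per-window context might be created so that it shares OpenGL objects with the off-screen "scratch" context; the attribute list and helper name are illustrative, not the actual implementation.

```c
#include <OpenGL/OpenGL.h>
#include <OpenGL/gl.h>

/* Create a hardware-accelerated, double-buffered context that shares
   textures and display lists with the given "scratch" context. */
static CGLContextObj createWindowContext(CGLContextObj scratchContext) {
    CGLPixelFormatAttribute attribs[] = {
        kCGLPFAAccelerated,
        kCGLPFADoubleBuffer,
        (CGLPixelFormatAttribute)0
    };
    CGLPixelFormatObj pix = NULL;
    GLint npix = 0;
    if (CGLChoosePixelFormat(attribs, &pix, &npix) != kCGLNoError || pix == NULL) {
        return NULL;
    }
    CGLContextObj ctx = NULL;
    /* Sharing with the scratch context lets resources created off-screen be
       used when rendering this window's back buffer. */
    CGLCreateContext(pix, scratchContext, &ctx);
    CGLDestroyPixelFormat(pix);
    return ctx;
}
```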
A: The Quartz and Sun2D rendering pipelines in Java SE 6 are strictly in-memory drawing routines that target the window back-buffer, which is memory shared with the WindowServer. Since WindowServer windows are not easily shared (impossible using only API) and are in-memory-only structures, they do not form an ideal substrate on which to build a performant, cross-process, and embeddable graphics system.
...
A: Since the Quartz and Sun2D rendering pipelines in Java SE 6 only support rendering to an NSWindow (among other obnoxious requirements of the NSView-based AWT heavyweights), applet content is rendered into an invisible NSWindow. After each update, a request is punted onto the main (AppKit) thread, which then creates a CGImageRef from the updated rectangle of the underlying window back-buffer pixels, and copies it into a CALayer. This CALayer is shared across processes back to the applet plug-in process, and is vended directly to the NPAPI via the CoreAnimation drawing model.
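As a sketch of the copy step just described (assuming a 32-bit premultiplied-ARGB back buffer; names and parameters are illustrative), the back-buffer pixels can be wrapped in a CGImageRef and cropped to the updated rectangle before being handed to the shared CALayer on the AppKit thread; that final Objective-C step is omitted here.

```c
#include <CoreGraphics/CoreGraphics.h>

/* Wrap window back-buffer pixels in a CGImageRef and crop to the dirty rect. */
static CGImageRef imageForDirtyRect(const void *backBuffer,
                                    size_t width, size_t height,
                                    CGRect dirtyRect) {
    size_t bytesPerRow = width * 4;
    CGColorSpaceRef rgb = CGColorSpaceCreateDeviceRGB();
    CGDataProviderRef provider =
        CGDataProviderCreateWithData(NULL, backBuffer, bytesPerRow * height, NULL);
    CGImageRef whole = CGImageCreate(width, height, 8, 32, bytesPerRow, rgb,
                                     (CGBitmapInfo)kCGImageAlphaPremultipliedFirst,
                                     provider, NULL, 0,
                                     kCGRenderingIntentDefault);
    /* Only the updated sub-rectangle needs to be pushed into the layer. */
    CGImageRef dirty = CGImageCreateWithImageInRect(whole, dirtyRect);
    CGImageRelease(whole);
    CGDataProviderRelease(provider);
    CGColorSpaceRelease(rgb);
    return dirty;
}
```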