@penguin42 I just finished watching this fine series of lectures on #Vulkan.
Episode 6 is specifically about real-time #RayTracing, and explains how shaders are bound to objects in the acceleration structure, one of the concepts I couldn't figure out by looking at the source code of the demos.
@jf I just finished reading the blogpost, which seems well written. I've always wondered why libappindicator wasn't deemed good enough.
Now I know, but I'm not sure why we couldn't address the issues with minor API changes to keep apps working and buy us the time to develop a solid cross-desktop protocol.
By the way, are the new Background Apps based upon systemd scopes? Do they work with Flatpak apps? Is there a FreeDesktop spec or other design documentation?
Disrupting workflows which are important to a significant share of users undermines their trust. Doing so intentionally is even worse: it gets them *angry* at us developers! By now we should have learned the lesson.
Apple and Microsoft, while not flawless, have done a better job at retaining their userbase than #GNOME and #KDE historically did.
That's one reason - perhaps the top reason! - why it's never "the year of linux on the desktop".
#Wayland took years to reach feature parity with X11, and there are still some rough edges.
Now suppose this was done without the escape hatch of an X11 session. Users with NVidia cards and users who need specific screen sharing apps would be up in arms.
@jf My audio needs are probably very basic, and they were already covered when #Pipewire was deemed good enough to replace PulseAudio in #Fedora.
No software is perfect; that wasn't my point. I'd call it a smooth transition when 95% of users are happy, and the rest can easily roll back for some time.
@jf@dropbear42 No matter how badly designed an API is, before removing one that many popular apps rely on, you should at least have a replacement already implemented; #GNOME didn't even have a design.
This forced Ubuntu and other downstream distros to come up with *something* to keep the UX from breaking.
Good developers don't merely design good APIs. They also design a smooth transition between old and new APIs.
The Linux desktop needs a well-designed model for managed apps, but it's taking longer than expected.
Android, iOS, macOS and even Windows have mature app stores with sandboxing, automatic updates, APIs for resource access and for requesting permissions.
Linux has two competing standards (#Flatpak and #Snap) and multiple barely-working GUI installers: GNOME Software, Ubuntu Software Center, Plasma Discover. If you've ever tried searching for apps in them, you know exactly what I mean.
«Five possible positions can be chosen for each of the two "playfield fingers." For example, you can place playfield 1 on top of sprites 0 and 1 (0), between sprites 0 and 1 and sprites 2 and 3 (1), between sprites 2 and 3 and sprites 4 and 5 (2), between sprites 4 and 5 and sprites 6 and 7 (3), or beneath sprites 6 and 7 (4). You have the same possibilities for playfield 2.»
@chainq I also spotted a set of #Amiga ROM Kernel Reference Manuals.
They were incredibly well written and beautifully typeset. Could you post a few diagrams from the Graphics and Intuition sections?
I remember one in which they used the fingers of two stylized hands to illustrate the relative priority of sprite groups versus bitplanes in dual-playfield mode.
You can guess from the number of contributors that standardization took considerable effort. The API must support a variety of use-cases, including hybrid rendering.
Ray queries are the simpler of the two available techniques: SPIR-V shaders can cast arbitrary rays and get back a (distance sorted?) list of hits.
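To make the idea concrete, here's a tiny CPU-side analogy (plain Python, not the Vulkan API — function names, the sphere scene, and the tuple layout are all my own stand-ins for an acceleration structure): cast one ray into a scene and get back the hits, nearest first.

```python
import math

def ray_sphere(origin, direction, center, radius):
    """Return the nearest positive hit distance along the ray, or None."""
    oc = [o - c for o, c in zip(origin, center)]
    a = sum(d * d for d in direction)
    b = 2.0 * sum(o * d for o, d in zip(oc, direction))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * a * c
    if disc < 0:
        return None                  # ray misses the sphere entirely
    t = (-b - math.sqrt(disc)) / (2 * a)
    return t if t > 0 else None      # ignore hits behind the origin

def ray_query(origin, direction, spheres):
    """Cast one ray; return (distance, sphere) hits sorted nearest-first."""
    hits = [(t, s) for s in spheres
            if (t := ray_sphere(origin, direction, s[0], s[1])) is not None]
    return sorted(hits)
```

On a GPU, the "scene" side of this would be the acceleration structure, and the hardware does the sorting and traversal for you.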
The first step to understanding real-time #RayTracing involves a leap to #Vulkan, which generalizes the old #OpenGL rendering model to enable building parallel, multi-pass pipelines using low-level primitives such as command buffers, queues, etc.
Back to the hardware RT: for a long time, GPUs could dispatch the parallel execution of short functions (historically called shaders). Each instance gets constants such as texture data and variables such as transformed coordinates. The output of these functions can be used to paint pixels on the screen or intermediate buffers.
Some old RT algorithms switched the two nested loops: the outer loop iterates over each triangle in the scene, and the inner one tests for intersections with screen pixels.
The advantage is that you only need to consider the pixels within the projection of the triangle.
You can also cull triangles that are completely occluded by other triangles in front of them.
I don't know whether this improved algorithm is actually a win for scenes with millions of triangles.
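A toy 2D sketch of that loop swap (all names and the 8×8 "screen" are assumptions of mine, not any real renderer): the outer loop is per triangle, and the inner loops only scan the triangle's screen-space bounding box instead of every pixel.

```python
WIDTH, HEIGHT = 8, 8

def edge(a, b, p):
    """Signed area: which side of the edge a->b is point p on?"""
    return (b[0] - a[0]) * (p[1] - a[1]) - (b[1] - a[1]) * (p[0] - a[0])

def covered_pixels(tri):
    """Pixels covered by one triangle; scans only its bounding box."""
    xs = [v[0] for v in tri]
    ys = [v[1] for v in tri]
    x0, x1 = max(0, min(xs)), min(WIDTH - 1, max(xs))
    y0, y1 = max(0, min(ys)), min(HEIGHT - 1, max(ys))
    hits = []
    for y in range(int(y0), int(y1) + 1):
        for x in range(int(x0), int(x1) + 1):
            p = (x + 0.5, y + 0.5)   # sample at the pixel center
            w0 = edge(tri[0], tri[1], p)
            w1 = edge(tri[1], tri[2], p)
            w2 = edge(tri[2], tri[0], p)
            # inside if all edge tests agree in sign (either winding)
            if (w0 >= 0 and w1 >= 0 and w2 >= 0) or \
               (w0 <= 0 and w1 <= 0 and w2 <= 0):
                hits.append((x, y))
    return hits
```

With millions of triangles, the question from the post stands: the per-triangle work is tiny, but you pay it once per triangle regardless of visibility.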
@penguin42 The schoolbook #RayTracing algorithm has an uneven per-pixel workload, making parallelization onto thousands of compute units inefficient.
Furthermore, computing the intersections requires access to the entire scene. Efficient data structures tend to be trees of bounding boxes, with recursive lookups.
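A minimal sketch of that "tree of bounding boxes with recursive lookups" (class and function names are mine; real BVHs are flattened arrays, not Python objects): a ray only descends into subtrees whose axis-aligned box it actually hits, pruning most of the scene per ray.

```python
class Node:
    """One BVH node: an axis-aligned box, plus subtrees or leaf triangles."""
    def __init__(self, box_min, box_max, children=(), triangles=()):
        self.box_min, self.box_max = box_min, box_max
        self.children = children      # inner node: subtrees
        self.triangles = triangles    # leaf: a few triangles to test exactly

def ray_hits_box(origin, direction, box_min, box_max):
    """Slab test; assumes no direction component is exactly zero."""
    tmin, tmax = 0.0, float("inf")
    for o, d, lo, hi in zip(origin, direction, box_min, box_max):
        t1, t2 = (lo - o) / d, (hi - o) / d
        tmin = max(tmin, min(t1, t2))
        tmax = min(tmax, max(t1, t2))
    return tmin <= tmax

def traverse(node, origin, direction, out):
    """Collect candidate triangles; skip subtrees whose box the ray misses."""
    if not ray_hits_box(origin, direction, node.box_min, node.box_max):
        return                        # prune the whole subtree at once
    out.extend(node.triangles)
    for child in node.children:
        traverse(child, origin, direction, out)
```

The divergent, data-dependent recursion is exactly why this maps poorly onto wide SIMD units, and why the RT cores do the traversal in fixed-function hardware.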