I hope there are more overlapping regulations than what Energy Star covered.
bobmcnamara 1 day ago [-]
Energy Star is going away.
fn-mote 1 day ago [-]
Author doesn’t even compare it to a second solution.
Interesting to know, but I just use a hot key to attempt reconfiguration if something goes wrong. Works for me even if it’s not a sign Linux is ready for non-technical users.
secure 1 day ago [-]
Yes, I didn’t want to analyze and compare different solutions, I just wanted to share the joy of finding a solution that works well for me.
Using hot keys is nice, but hot keys (intentionally) don’t work while my screen is locked. I contemplated mapping an xrandr call onto a smart button (Shelly Button 1, essentially triggering an HTTP request), but in the end grobi has the same effect and is even more convenient than having to press buttons.
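A minimal sketch of that smart-button idea, for the curious: a tiny listener that shells out to xrandr when the button's webhook fires. The output name DP-2, the port, and the path are placeholders, not the author's actual setup:

    package main

    import (
        "log"
        "net/http"
        "os/exec"
    )

    func main() {
        // The Shelly button is configured to request this URL on press.
        http.HandleFunc("/reconfigure", func(w http.ResponseWriter, r *http.Request) {
            // Ask xrandr to bring the output back up at its preferred mode.
            out, err := exec.Command("xrandr", "--output", "DP-2", "--auto").CombinedOutput()
            if err != nil {
                log.Printf("xrandr failed: %v: %s", err, out)
                http.Error(w, "xrandr failed", http.StatusInternalServerError)
                return
            }
            w.Write([]byte("ok\n"))
        })
        log.Fatal(http.ListenAndServe(":8080", nil))
    }

Point the button's action URL at http://<host>:8080/reconfigure and a press forces the output back on, even with the screen locked.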
wokkel 1 day ago [-]
My Windows laptop disconnects my monitor sometimes, seemingly randomly. A sign that Windows is not ready for non-technical users?
JCattheATM 1 day ago [-]
Hardly, given the mountain of evidence to the contrary.
conception 1 day ago [-]
You’ve never had to support non-technical Windows end users, I see.
JCattheATM 1 day ago [-]
That's a silly assumption, and a silly point. By that reasoning, no OS is suitable for non-technical users.
aaplok 1 day ago [-]
> By that reasoning, no OS is suitable for non technical users.
That was the point GP was trying to make (a bit snarkily and sarcastically) in response to the argument that Linux is not suitable for nontechnical users.
JCattheATM 23 hours ago [-]
Right, but GP is dead wrong. Windows and MacOS are far more suitable for non-technical users on average. Mint is great for people who have standard hardware and need nothing more than a browser, but it's still not on par with the big two.
justinrubek 9 hours ago [-]
I can't really agree with that on Windows. When I've had to use it, I've always had gnarly issues that I don't think I'd be able to deal with if I wasn't knowledgeable about computers. Sometimes, even then, basic functionality just doesn't work at all. It's further complicated by the software not giving proper insight into what is going wrong, making it impossible to deal with.
JCattheATM 5 hours ago [-]
It can have issues, sure, but there is a reason it's on 90% of computers and has been for 30 years or so - it's still easier than most of the competition.
aaplok 22 hours ago [-]
That Linux is less accessible than Windows or MacOS may be true (I personally agree with you about Windows, less so about MacOS), but if an argument is not acceptable about Windows, it can't be accepted about Linux either. If both OSs seem to suffer from difficult-to-fix issues when turning monitors on and off, that can't be the reason why Windows is more non-tech friendly.
I think that this is the bulk of the argument here.
JCattheATM 5 hours ago [-]
> but if an argument is not acceptable about Windows, it can't be accepted about Linux either.
I think it can, because even if there is an issue on Windows, it's likely still going to be much easier to resolve than on Linux, e.g. no editing files, no command prompt, etc.
conception 18 hours ago [-]
Up to iOS v14 or so, no one ever asked me for help using it or really fixing it. MacOS passed the grandma test for me and generally requires less user support. But iOS under Jobs was a gold standard in usability.
JCattheATM 5 hours ago [-]
I've found non-tech users find MacOS far less intuitive, honestly. I do think the Windows paradigm is probably the most intuitive, with a start-bar launcher and apps being clearly separated in the taskbar and not grouped together under a bouncing icon.
arghwhat 1 day ago [-]
> Does grobi work on Wayland?
See kanshi, which has a similar rule matching approach.
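For reference, kanshi rules live in ~/.config/kanshi/config; a rough illustrative sketch (the output names and modes here are made up, see kanshi(5) for the exact syntax):

    profile docked {
        output eDP-1 disable
        output "Dell Inc. UP3218K" mode 7680x4320 position 0,0
    }

    profile mobile {
        output eDP-1 enable mode 1920x1200
    }

A profile is applied when the set of connected outputs matches the ones it lists, which is the same rule-matching idea grobi uses on X11.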
modzu 1 day ago [-]
I'm sensitive to coil whine and I hear it everywhere: computers, light bulbs, phone chargers, you name it. If I'm in the same room as electronics, I hear a high-pitched squealing that others seem not to notice or care about. It's inescapable and it sucks.
bobmcnamara 1 day ago [-]
I used to be. With one high-speed camera, I could tell the frame rate consistency from the flash recharge whine.
But one day a young engineer asked if I could hear my circuit when the load changed.
I could not. I have become what I hated. The cycle continues.
Please stop. Let other people have their things. It's disruptive and the kind of behaviour that makes HN toxic.
secure 1 day ago [-]
Thanks for the advice. I try switching to Wayland every year, but it has never worked without heavy graphics artifacts / flickering / glitches on any of my machines (I use an nVidia GPU so that I can drive my Dell UP3218K monitor).
Meanwhile, X11 works really well for me. No tearing, no artifacts, no breakages on upgrades. Really can’t complain.
Maybe next year.
pabs3 19 hours ago [-]
Yes, multi-protocol display servers like Arcan are the future.
https://arcan-fe.com/
There are still plenty of reasons to use X11, mostly for software that doesn't support Wayland yet.
alabastervlog 1 day ago [-]
Or: stable and finished.
rollcat 1 day ago [-]
Either that, or a dead end. X11 stayed afloat thanks to the endless extensions, in particular Xcomposite. Windows and OSX have had full modern graphics stacks for 20-25 years, by default. OSX in particular has provided backwards compatibility in a similar fashion to XWayland (via Carbon).
Every developer who cared about X11 has moved on; it receives little to no maintenance. We already have hardware/software where X11 is entirely unsupported. It's likely we'll see more in the future.
bitwize 1 day ago [-]
Display tech keeps evolving. Proper scaling on HiDPI displays, HDR -- these are things X just doesn't (and will never) have support for because of its architecture. Wayland is the path forward. ALL of the developers who know anything about the display stack have committed to developing for Wayland and deprecating X.
ryao 18 hours ago [-]
I find the Xorg X11 server works just fine with HiDPI displays, while Wayland has issues (mainly just zoom). Alan Coopersmith is not on the Wayland bandwagon as far as I know.
We might get HDR from the Xorg X11 server one day. Once it is in place for everything else, we would just need a X11 extension to hook all of that into X11.
bitwize 7 hours ago [-]
You can't just bung an extension in to do HDR. You have to change the X11 protocol itself. This is because the X11 protocol was designed way back in the 80s, when it was completely unfathomable that anyone would need more than 32 bits per pixel. So pixel values are CARD32s (unsigned 32-bit integers) in X11. So yeah, I guess theoretically you could do 10-bit-per-channel HDR in the current protocol (3 x 10 = 30 bits fits in a CARD32; 3 x 12 would not), but not anything more. And forget about floating point.
There are NO plans by any of the Xorg maintainers to add HDR support to X11 or the Xorg server, for this reason. There probably never will be.
Window coordinates are worse -- they are INT16s (signed 16-bit integers). So any window, including the root window, tops out around 32,767x32,767. That's as big a screen as you can get in X11. You sit four 8K monitors next to each other and you start getting close to that hard limit (see the arithmetic below). (Don't dismiss it out of hand. I once worked with a guy who needed no less than five monitors. It was just how he worked. And that's not even addressing things like large wall-sized displays for meetings, war rooms, and the like. Once those start getting high-res, they go beyond X11's ability to handle.) Again, simply extending the protocol won't help -- the protocol must be changed.
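To put numbers on the four-8K scenario (just arithmetic, not X11 API code; assuming 7680 px of horizontal resolution per 8K UHD panel):

    package main

    import "fmt"

    func main() {
        const int16Max = 1<<15 - 1    // X11 coordinates are signed 16-bit: 32767
        const tileWidth = 7680        // horizontal resolution of one 8K UHD panel
        const desktop = 4 * tileWidth // four panels side by side
        fmt.Printf("4x8K desktop: %d px wide; INT16 ceiling: %d px\n", desktop, int16Max)
        // Prints: 4x8K desktop: 30720 px wide; INT16 ceiling: 32767 px
    }

Four panels leave barely 2,000 px of headroom; a fifth would blow past the limit.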
Well, while we're changing the protocol, why not do some housekeeping? X11 wasn't designed with security in mind, so let's address that from the jump in our new protocol. Also, modern toolkits do all their rendering client-side, relying on OpenGL or Vulkan to provide GPU-accelerated rendering. All you really need the display server for is to manage the rendering surfaces, composite them together on the final display, and handle user events. So let's start by eliminating all those draw calls from the 1980s that were designed for 1980s displays and not modern GPUs, and pare down the protocol to what's necessary for the above, to avoid bloat and legacy cruft.
Oh shit, we just invented Wayland.
Wayland is the changes to X11 that are necessary in order to move forward.
anthk 1 day ago [-]
We should be using Rio then?
bitwize 4 hours ago [-]
I wish. Unfortunately, most operating systems do not have the advanced file system malarkey that Plan 9 does, which is kinda necessary for a window system like rio to work well.
raverbashing 1 day ago [-]
So once again you need to DIY your monitor configuration on Linux, which for some reason pretty much works out of the box on Windows and MacOSX
sigh
And that's for X11, which was built on a 70s model, while Wayland leisurely moves forward
fsh 1 day ago [-]
Both Gnome and KDE handle (un-)plugging external monitors just fine. And Wayland has been the standard in all relevant desktop distributions for a couple of years now.
secure 1 day ago [-]
The Dell UP3218K monitor I describe does not work “out of the box” on any OS. Even finding a GPU that can drive it at all is tricky.
p_l 1 day ago [-]
It should work with tiled output, so long as a) it reports tiled geometry in DisplayID and b) the driver handles it right.
Then it should show up as single display.
It's also how Apple XDR display presents itself to MacOS (two DisplayPort 1.4 tunnels over USB4, tiled layout in DisplayID).
I suspect it's possible that it doesn't have a valid tiled geometry block, but that's something that was already handled right when the first 4K displays landed, so...
ikurei 1 day ago [-]
I use both Linux and Mac, and in my experience Mac's handling of multi-monitor setups and, especially, of them changing, is only slightly better than Linux's.
For most situations you do not need to do anything difficult to plug any number of monitors to a Linux computer with a modern, full-featured distro, other than arranging them. Mac does a better job of remembering your setup and adapting to a monitor disappearing, but it's not that much better.
I'm still not sure I understand why the author needed this tool; maybe because they have more than one computer plugged into the same monitor?
ryao 18 hours ago [-]
X11 was released in September 1987, long after the 70s. All of the older ways of doing graphics were killed by it due to technical merit.
raverbashing 16 hours ago [-]
Thanks, I didn't realize it was so new
> older ways of doing graphics were killed by it due to technical merit.
The issue here is that a server/client architecture complicates things a lot when it's all the same machine, and the security model is different.
sprash 1 day ago [-]
> while Wayland leisurely moves forward
Debatable. But it sure started with a huge step backwards. On X11 all relevant functions are at least standardized within the xrandr protocol. On Wayland you don't even have that, so whether it works really depends on the compositor, with each one doing its own thing, which is just crazy. I prefer the 70s standardization model of "mechanism, not policy".
That's absurd. There are regulations on standby power.
https://dl.dell.com/manuals/all-products/esuprt_electronics_...
> Power Consumption
> 0.2 W (Off Mode)
> 0.3 W (Standby Mode)
Doesn't seem to be an isolated case:
https://www.dell.com/community/en/conversations/monitors/up3...
> UP3216Q, drawing 23 watts in Standby? (2019).
I guess a takeaway from OP is to measure your actual standby power draw.
http://monitorinsider.com/displayport/dp_pin20_controversy.h...
https://www.dell.com/support/kbdoc/en-us/000132935/the-20th-...