I don't see the need to tie OS support to hardware support. It's totally plausible to say, for example, "we'll support users on Debian 6 but only if they have a <10-year-old graphics card".
According to Wikipedia, OpenGL 3.x support is available for cards as old as:

* NVIDIA GeForce 8 series (12 years old)
* ATI Radeon HD 2000 series (11 years old)
* Intel Sandy Bridge integrated graphics (8 years old)

Does the project have any official hardware age/spec cutoff at this point? I don't think an 11/12-year age cutoff for discrete graphics is unreasonable. The only users we'd impact by switching the OpenGL cutoff to 3.x are users of 11+ year-old discrete GPUs or 8+ year-old integrated GPUs, who are either on ancient laptops (where no GPU upgrade is possible) or have a desktop but can't spare the cash for a cheap GL 3.x-capable card. Considering that Newegg lists an ancient Radeon HD 3450 (which supports OpenGL 3.3) for 11 USD, this doesn't seem like a major burden.
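Regarding the runtime-detection idea Andrew raises below: probing the version of the current GL context is cheap, and could drive the choice between a modern path and a GL 2.1 fallback. What follows is a minimal sketch, assuming a GL context is already current (e.g. inside a wxGLCanvas paint handler); the helper name and the selection logic at the end are illustrative, not existing KiCad code.

    #include <GL/gl.h>
    #include <cstdio>

    // GL_MAJOR_VERSION / GL_MINOR_VERSION were introduced with OpenGL 3.0;
    // older gl.h headers may not define them.
    #ifndef GL_MAJOR_VERSION
    #define GL_MAJOR_VERSION 0x821B
    #define GL_MINOR_VERSION 0x821C
    #endif

    // Hypothetical helper: true if the current context provides at least
    // the requested version. Must be called with a GL context bound.
    static bool glVersionAtLeast( int aMajor, int aMinor )
    {
        GLint major = 0, minor = 0;

        // On a >= 3.0 context these queries just work...
        glGetIntegerv( GL_MAJOR_VERSION, &major );
        glGetIntegerv( GL_MINOR_VERSION, &minor );

        // ...while a 2.x context raises GL_INVALID_ENUM and leaves the
        // outputs untouched. In that case (or if some earlier error was
        // pending, which is harmless here) parse the version string,
        // which on desktop GL always starts with "<major>.<minor>".
        if( glGetError() != GL_NO_ERROR || major == 0 )
        {
            const char* ver =
                reinterpret_cast<const char*>( glGetString( GL_VERSION ) );

            if( !ver || sscanf( ver, "%d.%d", &major, &minor ) != 2 )
                return false;   // no usable version info: assume the worst
        }

        return major > aMajor || ( major == aMajor && minor >= aMinor );
    }

    // Illustrative use: enable a shader-based outline path only where it
    // is actually available, keeping the GL 2.1 path as the fallback.
    // bool useGeometryShaderOutlines = glVersionAtLeast( 3, 0 );

Whether maintaining two render paths is worth the extra complexity is, of course, the real question in this thread.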
On 11/05/2019 23:58, Andrew Lutsenko wrote:
> Is it possible to determine OpenGL hardware support at runtime and use the
> advanced API on newer machines while switching to a fallback for older ones?
> I believe that solution would be the best of both worlds.
> Otherwise, the only reasonable cut-off date would be when the officially
> supported OS versions no longer support hardware with a given OpenGL
> version. (As a hypothetical example: if kernel 7.0 were to drop driver
> support for the HD2000 and the like, there would be no sense in clinging to
> the old API; but graphics drivers live a very long time.)
>
> Regards,
> Andrew
>
> On Sat, May 11, 2019 at 7:33 AM jp charras <jp.char...@wanadoo.fr> wrote:
>
> > On 11/05/2019 16:18, Jon Evans wrote:
> > > Hi JP,
> > >
> > > Thanks for the input, it's a good point. It sounds like in this case
> > > there may be a technical solution that works fine in GL 2.1, so I
> > > agree it makes sense to avoid raising requirements if at all possible.
> > >
> > > Do you have any thoughts about how we should handle this in the
> > > future, if for some reason there is a technical challenge that cannot
> > > be solved as easily without moving to a higher OpenGL version (or some
> > > other, similar hardware support issue)? Should we always design for
> > > 2.1, or is there some time in the future when it becomes appropriate
> > > to expect users to have something newer?
> > >
> > > Best,
> > > Jon
> > >
> > > On Sat, May 11, 2019, 07:27 jp charras <jp.char...@wanadoo.fr> wrote:
> > >
> > > > On 10/05/2019 18:43, Jon Evans wrote:
> > > > > Does anyone have a good sense of which hardware / software
> > > > > platforms would be impacted by a switch to OpenGL 3.0 as the
> > > > > baseline requirement?
> > > > >
> > > > > As far as I am aware, all commercial tools in this space have
> > > > > more advanced / modern system requirements than KiCad, with the
> > > > > possible exception of Eagle. We have to consider whether
> > > > > supporting old graphics cards runs counter to the desire to have
> > > > > KiCad handle more professional use cases (including large
> > > > > designs).
> > > > >
> > > > > The integrated Intel GPUs old enough to lack OpenGL 3.0 are no
> > > > > longer supported by Intel (everything since the HD 2000 series
> > > > > has it, as far as I know).
> > > > >
> > > > > -Jon
> > > >
> > > > Hi Jon,
> > > >
> > > > To be honest, I do not share your opinion, and I am not especially
> > > > thrilled about switching to OpenGL 3.0 as the baseline requirement
> > > > as long as we can avoid it. OpenGL 2.1 is a reasonable requirement.
> > > >
> > > > What is a professional use case? I know at least two opposite ones:
> > > > - An advanced user who designs very complex boards: he needs a
> > > > recent, fast computer with 2 (or even 3) monitors.
> > > > - Classroom "users": they are also professional users, but they do
> > > > not design very complex boards, and forcing them to update their
> > > > computers just to use KiCad is a serious issue. As an old teacher,
> > > > I know what I am talking about: in the department where I worked
> > > > before retiring, we had roughly 200 PCs (most of them dual-boot:
> > > > Linux and Windows). Not all of them are used to run KiCad, but many
> > > > are. Saying "our new KiCad version needs a recent computer and OS,
> > > > so you have to replace 50 or 100 of your computers" is a bit hard
> > > > on these users.
> > > >
> > > > > On Fri, May 10, 2019, at 12:33 PM, Tomasz Wlostowski wrote:
> > > > >> Hi,
> > > > >>
> > > > >> I've recently been playing with Victor's huge 32-layer PCB
> > > > >> design, trying to improve the performance of pcbnew for larger
> > > > >> designs. This board causes even pretty decent PCs to crash or
> > > > >> render glitches due to pcbnew's enormous VBO (Vertex Buffer)
> > > > >> memory consumption.
> > > > >>
> > > > >> It turns out this is caused by the way KiCad renders filled
> > > > >> zones:
> > > > >> - The inside of a zone is drawn/plotted as a filled polygon with
> > > > >> a 0-width boundary. This one is not a problem: we already
> > > > >> triangulate the polygons, and I recently developed a patch for
> > > > >> the OpenGL GAL that allows reusing vertices of triangulated
> > > > >> polygons in the VBO/index buffer to further reduce the memory
> > > > >> footprint.
> > > > >> - The thick outline is drawn with rounded segments whose width
> > > > >> is the minimum width of the polygon. Since we don't have arcs in
> > > > >> polygons, each round feature (e.g. a via) surrounded by a zone
> > > > >> gets a ton of tiny segments in the polygon outline. Each rounded
> > > > >> segment in OpenGL is composed of 2 triangles, hence 6 vertices
> > > > >> (which can't be reused...). For Victor's board this means 1 GB
> > > > >> (sic!) of the VBO goes to the outlines of the polygons alone.
> > > > >> Disabling the outline drawing makes the renderer run smoothly
> > > > >> again.
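To put Tom's 1 GB figure in perspective, here is a back-of-the-envelope estimate; the 32-byte vertex size and the segment count below are assumptions for illustration, not measured values from Victor's board.

    #include <cstdio>

    int main()
    {
        // Assumed GAL vertex layout: 3 floats position + 4 bytes RGBA +
        // 4 floats of shader parameters = 32 bytes (an estimate, not the
        // exact KiCad VERTEX struct).
        const double bytesPerVertex  = 32.0;
        const double verticesPerSeg  = 6.0;   // 2 triangles, no sharing
        const double outlineSegments = 5.0e6; // hypothetical dense board

        double gib = outlineSegments * verticesPerSeg * bytesPerVertex
                     / ( 1024.0 * 1024.0 * 1024.0 );

        printf( "outline VBO: %.2f GiB\n", gib );  // ~0.89 GiB
        return 0;
    }

A few million outline segments is entirely plausible once every via inside a large zone contributes dozens of tiny arc-approximation segments, so the gigabyte range Tom reports is the expected order of magnitude.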
> > > > >> I've been experimenting with some ways to fix this:
> > > > >> - Generating the thick outline strokes in a geometry shader.
> > > > >> This means bumping the requirement to GL 3.0+, and thus farewell
> > > > >> to many Linux / older integrated graphics users.
> > > > >> - Caching a triangulated polygon that is the boolean sum of the
> > > > >> filled inside and the thick stroked outline. This takes a lot of
> > > > >> time to load (~2 minutes for Victor's design) and still uses
> > > > >> quite a bit of VBO memory. Another downside is that the polygons
> > > > >> are not fully WYSIWYG (outline segments have true rounded
> > > > >> corners, while the corners of the displayed shape would be
> > > > >> approximated with line segments).
> > > > >> - Changing the way KiCad handles filled zones to calculate the
> > > > >> (stroke + inside) boolean sum during the zone filling process.
> > > > >> This means changes to the plotting/GAL/3D code, but no changes
> > > > >> to the file format. We would also have to tell users to refill
> > > > >> the zones when they open a file generated by an older KiCad
> > > > >> version.
> > > > >>
> > > > >> Which solution would you prefer?
> > > > >> Cheers,
> > > > >> Tom
> >
> > --
> > Jean-Pierre CHARRAS
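One closing observation on the third option above: the boolean sum of the filled area and a rounded stroke of width w centered on its boundary is exactly the filled area inflated by w/2 with round joins, so no explicit stroking pass is needed. Below is a minimal sketch using ClipperLib (the polygon clipping library KiCad already relies on for its booleans); the function name and calling context are hypothetical.

    #include "clipper.hpp"   // ClipperLib, shipped with the KiCad sources

    // Sketch of option 3: fold the thick outline into the fill at
    // zone-fill time. A positive offset with round joins is equivalent
    // to OR-ing the fill with its rounded-segment outline stroke.
    ClipperLib::Paths fillWithOutline( const ClipperLib::Paths& aFill,
                                       double aMinWidth )
    {
        ClipperLib::ClipperOffset offsetter;
        offsetter.AddPaths( aFill, ClipperLib::jtRound,
                            ClipperLib::etClosedPolygon );

        ClipperLib::Paths result;
        offsetter.Execute( result, aMinWidth / 2.0 ); // outward by w/2
        return result;
    }

The round joins are still approximated by line segments, so this carries the same WYSIWYG caveat Tom notes for the cached-boolean approach, at whatever arc tolerance the offsetter is configured to use.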