You can look at it two ways: the graphics card manufacturer's fault, or the software application's fault.
Actually, I believe it is mostly the graphics card manufacturer's fault for not fully supporting the OpenGL standard across all of its cards (including the consumer-level ones). The exact same issue shows up in SolidWorks, 3D Studio, and other professional 3D software.
I assume there is some sort of GOOD reason for the application programmers to 'limit' their software in this manner. If not, why would they all do it? If they could fix this and support lower-cost consumer-level graphics hardware without any adverse effects, they would; they would only gain market share.
No, I am pretty sure the blame sits squarely with the hardware manufacturers (nVidia, ATI, and in your case, Intel). They are just using a classic marketing technique to make more profit. Here is the scenario: consumer graphics is a high-volume, EXTREMELY competitive market where cost generally rules over performance. Professional graphics is a lower-volume, less competitive market, but (more importantly) one where performance rules over cost. What manufacturers end up doing is designing a single chip that can handle both consumer and pro applications. Then, to compete in the consumer market AND still make a good profit in the pro market, they disable some key pro features in the consumer versions of the product. Thus they can sell LOTS of consumer cards at a low price, and fewer pro cards at a high price.
This is done all the time in every industry. For example, Intel disables some features on a Pentium and sells it as a Celeron. John Deere sells tractors at different horsepower ratings where the only difference is the fuel-mixture adjustment (one little screw or a software setting).
There are all sorts of examples like this...