Question about how best to limit input voltage to opamp

I have an application where the input can be as high as 21 VDC. I want to use a non-inverting stage, powered from 24 VDC, with a gain < 1 to bring the signal within a 3.3 V range before feeding it to an ADC whose rail is tied to 3.3 VDC. By selecting the right gain resistors, the ADC should never see more than 3.3 VDC, but I want additional protection in case something goes wrong and the output of the op-amp stage swings the ADC input beyond 3.3 VDC and destroys the ADC.
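As a side note, a textbook non-inverting op-amp stage has gain = 1 + Rf/Rg, which is always ≥ 1, so attenuating 21 V down to 3.3 V usually means a resistive divider ahead of (or in place of) the gain stage, e.g. a divider followed by a unity-gain buffer. A rough worked sketch of the numbers, with assumed example E24 divider values not taken from the original post:

```python
# Worked numbers for scaling a 21 V max input to stay under a 3.3 V ADC rail.
V_IN_MAX = 21.0
V_ADC_MAX = 3.3
required_gain = V_ADC_MAX / V_IN_MAX  # ~0.157, i.e. attenuation, not gain

# Example divider (assumed values): 27k over 4.7k, then buffer with the op-amp.
R_TOP = 27e3
R_BOTTOM = 4.7e3
divider_gain = R_BOTTOM / (R_TOP + R_BOTTOM)  # ~0.148, comfortably below 0.157

print(f"required gain <= {required_gain:.3f}")
print(f"divider gain   = {divider_gain:.3f}")
print(f"max ADC input  = {V_IN_MAX * divider_gain:.2f} V")
```

Choosing a divider gain a few percent below the theoretical maximum leaves margin for resistor tolerance.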
One way I've seen this done is to tie the output of the op-amp gain stage to the anode of a diode whose cathode is tied to 3.3 VDC. Is this a good approach? Are there any caveats I should be aware of, or other ideas you can suggest?
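One detail worth noting about that rail-clamp arrangement: the diode only conducts once the node rises a forward drop above the 3.3 V rail, so the ADC pin can still see more than 3.3 V during a fault. A rough sketch of the clamped levels, assuming typical (not datasheet-specific) forward voltages:

```python
# Clamp level = rail voltage + diode forward drop (typical values assumed).
V_RAIL = 3.3
VF_SILICON = 0.7   # typical silicon diode, e.g. small-signal rectifier
VF_SCHOTTKY = 0.3  # typical Schottky diode, lower drop

print(f"clamp with silicon diode: ~{V_RAIL + VF_SILICON:.1f} V")
print(f"clamp with Schottky:      ~{V_RAIL + VF_SCHOTTKY:.1f} V")
```

Whether ~3.6 V or ~4.0 V is acceptable depends on the ADC's absolute-maximum input rating, so check the datasheet.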

I would put a simple zener across the input to be protected.
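If going the zener route, a series resistor between the op-amp output and the protected node limits the fault current through the zener. A back-of-the-envelope sizing sketch, with assumed part and resistor values (the original posts specify neither):

```python
# Zener clamp sizing under a worst-case fault where the op-amp output
# swings to its 24 V supply. Part and resistor values are assumed examples.
V_OUT_MAX = 24.0   # op-amp output stuck at the positive rail (fault case)
V_ZENER = 3.3      # e.g. a 3.3 V zener such as a BZX84C3V3 (assumed part)
R_SERIES = 1e3     # series resistor between op-amp output and ADC pin

I_zener = (V_OUT_MAX - V_ZENER) / R_SERIES  # current the resistor allows
P_zener = V_ZENER * I_zener                 # power the zener must dissipate

print(f"fault current through zener: {I_zener * 1e3:.1f} mA")
print(f"zener dissipation:           {P_zener * 1e3:.0f} mW")
```

One trade-off: the series resistor also forms a low-pass filter with the ADC's input capacitance and adds source impedance during sampling, so it should be sized against the ADC's recommended driving impedance as well as the fault current.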

Polytechforum.com is a website by engineers for engineers. It is not affiliated with any of manufacturers or vendors discussed here. All logos and trade names are the property of their respective owners.