OK... so the crux of the suggestion is that digital cameras are not recording an image, but a gazillion light-level readings; and that with sufficient sensitivity and range, 'exposure'... I am going to assume a qualification here: exposure VALUES... doesn't really matter. So far... yeah, I am sort of with you, and inclined to agree on a very, very broad 'principle'.
EVs (as marked on my hand-held light meters) never 'really' mattered anyway. Before standardized commercial emulsions, photographers assessed lighting by eye, and made a best guess at the 'settings' to use for the scene they saw and for how strongly they had brewed the stuff they smeared on their glass plates.
When standardized commercial photo-emulsions became available, 'settings' were still usually decided upon by assessing the scene's lighting 'by eye'; it was only when selenium-cell light meters became available that EVs were invented.
The scientific 'standard' units here are the lumen, for luminous flux, and the lux (lumens per square metre) for illuminance. I have a rather nice lux light-meter somewhere; it is almost utterly useless for photography, since trying to translate lux readings into something 'useful' for deciding on camera settings is hopeless. Hence the EV scale is a contrivance of convenience, one that condenses the logarithmic nature of light intensity to a more useful scale, which can be readily transcribed from an EV value into shutter speed, aperture setting and ASA/ISO value; in conjunction with the aperture similarly being 'condensed' for convenience, from the actual linear diameter of a hole to a ratio proportional to the lens's focal length.
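That condensing can be sketched in a couple of lines. This is a toy illustration, not any camera's firmware; the conventional relation at ISO 100 is EV = log2(N²/t), for f-number N and shutter time t in seconds:

```python
import math

def exposure_value(f_number: float, shutter_s: float) -> float:
    """EV at ISO 100: EV = log2(N^2 / t), condensing the log scale of light."""
    return math.log2(f_number ** 2 / shutter_s)

# f/8 at 1/125 s lands on the familiar 'EV 13' of a bright overcast day
print(round(exposure_value(8, 1 / 125)))  # -> 13
```

Two whole stops in any of the three variables shift the result by exactly 2 EV, which is precisely the convenience the scale was contrived for.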
There never was a scientifically calculable 'correct' exposure value; on film, there was simply a range of EVs over which a more or less 'acceptable' image may be formed, within the dynamic range of the capture medium, and it was always a matter of personal discretion where within that range anyone thought the EV was best centred.
Hence the migration to digital technology, de-coupling the issue of 'image making' from the point of capture to the point of display: recording not an 'image' but an array of individual light-level intensities across a scene, from which an image may be rendered.
IF you have sufficient sensitivity to accurately record the very lowest light-level intensity of each individual 'pixel' of a scene, and across sufficient dynamic range that the computer can record a 'meaningful' value everywhere and anywhere, then it doesn't matter where the 'exposure value' is centred: you can take that data, and centre the brightness of an image produced from it anywhere you like, without losing either shadow or highlight detail at either end....
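A toy sketch of that idea, assuming perfectly idealised linear readings with no clipping at either end (the scene values and the stop shifts are invented for illustration):

```python
# Idealised capture: linear light readings, no clipping, no noise floor
scene = [0.001, 0.02, 0.5, 4.0, 90.0]  # arbitrary linear intensities

def re_expose(readings, stops):
    """Re-centre brightness after the event: one stop = a factor of two."""
    return [r * 2 ** stops for r in readings]

darker = re_expose(scene, -2)    # pull the rendered 'exposure' down two stops
brighter = re_expose(scene, +3)  # push it up three stops; no data was lost
```

Because nothing ever clipped, either rendering can be produced from the same data-set; that is the whole of the 'EV becomes redundant' argument in five lines.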
THEN, in an incredibly idealised 'perfect' system, well? The Exposure Value 'may' become something more of a redundancy, and a less useful 'convenience' than it has ever been... but that doesn't necessarily make it 'redundant'.
What such an 'idealised' system MAY do is make the ISO sensitivity of a sensor or film-speed 'irrelevant'.. and THAT is about all.
You would have a system with a 'universal' ISO, which you didn't need to adjust when you made an 'exposure' in order to 'centre' the EV...
Variable ISO settings in digital are mimicking what used to be done slightly more laboriously with film, or even with the preparation of glass-plate emulsions. They merely provide a means to overcome the 'problem' that there has not yet been a 'universal' ISO-sensitivity emulsion or sensor, and offer a mechanism by which you can adjust the centre of the dynamic range to better suit the lighting situation you are contending with.
Digital systems have made this rather more convenient and easy to achieve, adjusting the amount of signal amplification from individual receptors on a sensor array, compared to brewing a completely different strength of silver-nitrate-and-egg-white emulsion... but little else. It is worth noting that the actual 'mechanics' of the ISO dial on a digital camera isn't doing very much different to the button on my old selenium-cell Leningrad EV meter, which switched the scale between 'Indoor' and 'Outdoor' by changing a resistor value in the Wheatstone bridge circuit of the galvanometer, to increase the needle movement when the sensor voltage was low in dim light, and reduce it when it was high in bright light!
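The gain analogy is easy to demonstrate with a toy simulation (the signal level, noise figure and gain values are all invented): amplification lifts the noise right along with the signal, so the underlying signal-to-noise ratio is unchanged, exactly as with the resistor in the old meter.

```python
import random

random.seed(42)  # reproducible toy run

# Toy sensor: a fixed 'true' light level plus random read-out noise
signal, noise_sigma = 10.0, 1.0
samples = [signal + random.gauss(0, noise_sigma) for _ in range(10_000)]

def stats(values):
    """Mean and standard deviation of a list of readings."""
    mean = sum(values) / len(values)
    sd = (sum((v - mean) ** 2 for v in values) / len(values)) ** 0.5
    return mean, sd

results = {}
for gain in (1, 4, 16):  # crude stand-ins for ISO 100, 400, 1600
    amplified = [gain * s for s in samples]
    results[gain] = stats(amplified)
    # gain lifts the mean AND the noise by the same factor:
    # the reading gets brighter, not more accurate
```

Turning the ISO dial up never adds information; it just re-centres what the sensor already read, noise included.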
Personally, I am NOT convinced that modern digital sensors ARE peculiarly more sensitive than halide film, nor that they can 'accurately' record such a high range of light-level readings across such a large dynamic range, even as well as film, let alone one SO wide as to provide a UNI-sensor that could cope with all situations and light levels without manual intervention to centre an 'exposure'.
YES, modern digi-sensors are amazing; my DSLR has an ISO sensitivity range that starts at a nominal ISO 100 and goes all the way up to ISO 25,600, or something in that order... BUT by means of signal amplification, which amplifies signal distortion, or 'noise', right along with the signal.
THAT sort of suggests that even the best current digi-sensors, for all the low-light sensitivity they may possess, and whatever ability they may or may not have to record an enormous dynamic range towards making EV centring somewhat redundant... they DON'T record the light levels all THAT accurately, and certainly not accurately enough that they are anywhere near achieving the range of sensitivity and accuracy that would be required of a practicable 'Universal ISO Sensor'....
They are currently only just able to mimic the capabilities of silver halide, and as far as dynamic range is concerned, thanks to digital-sampling 'threshold' clipping, systems are STILL far from achieving the subtle rendering between light levels that film may record...
The potential to produce a universal sensor MAY be there; but it would require sensors far more sensitive and far more accurate than even the current generations available, and, more still, far more 'refined' sampling schemes to exploit them than the 16 bits per channel employed by most digital imaging of the last two decades or so.
It 'may' be theoretically possible, it may even be technologically possible... BUT in the last ten or fifteen years of digital-image evolution there has been very little movement towards it; camera makers have been striving not to 'improve' upon basic digital sampling algorithms conceived and standardised, what, twenty years ago, but dumbing down to a mass-market 'Acceptable Quality Level', and striving for cost savings over significant performance enhancements.
So, again, personally I don't think that a uni-ISO sensor is something that is here, or in development, still less likely to be a practical commercial viability in the foreseeable future.
Meanwhile... 'Exposure', practically, is picking the 'settings' of ISO sensitivity or film speed, shutter speed and aperture, not just to suit the ambient light intensity, but the scene as a whole, and the subject within it; to make a picture not just with a well-centred average brightness, but to control depth of field, and motion blur, too.
IF the sampling scheme is powerful enough that it records changing light levels at small enough sampling periods, say 10 kHz or more, for a period of, what, 30 seconds (the range of shutter speeds of my DSLR), to complete a data-set that effectively contains a series of exposures at a shutter speed above 1/10,000th, those can then be 'exposure stacked' to 'streak' a subject across a frame, or not, in post-process. That 'might' make the shutter-speed setting less relevant at point of capture, in a similar manner to a 'uni-sensor' making the ISO setting irrelevant at point of capture; in post-process, you may be able to re-select 'effective' settings to 'make' an alternative 'exposure' after the event.... but that would leave the data-set still dependent on aperture setting. Now again, that could be made irrelevant if the system sampled the scene at all possible aperture settings on the lens, to likewise allow those to be exposure stacked in post-processing...
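The stacking idea itself is simple enough to sketch. A toy version (the 10 kHz period, frame count and pixel values are all invented): sum the first n sub-exposures per pixel, and you have synthesised an effective shutter speed of n times the sampling period.

```python
def stack(frames, n):
    """Sum the first n sub-exposures per pixel: effective shutter = n * t_sub."""
    return [sum(pixel) for pixel in zip(*frames[:n])]

t_sub = 1 / 10_000                          # hypothetical 10 kHz sampling period
frames = [[0.2, 0.5] for _ in range(300)]   # toy scene: 2 pixels, constant light

img_fast = stack(frames, 10)    # effective 1/1000 s 'exposure'
img_slow = stack(frames, 100)   # effective 1/100 s: brighter, more motion blur
```

With a static scene the slow stack is just a brighter version of the fast one; with a moving subject, the extra summed frames are exactly where the 'streak' would come from.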
NOW what you are talking about is not just a uni-sensor, but a dynamic, multi-scanning camera, sampling a scene at a frequency not just in excess of 10 kHz to compile a data-set covering the range of shutter-speed effects, but maybe 24x that frequency to do so at all available aperture settings, AND at something probably in excess of 64 bits per channel...
It may be possible to push the electrickery of a sensor array to obtain that sort of sampling frequency, even to obtain something close to the required sensitivity and accuracy at that short a sampling period; possibly even to 'cheat' somewhat, using interpolation and comparison between samples, to 'render' that many effective exposures that fast... but the aperture remains a physical hole, and making an electronic iris react THAT fast, I suspect, WOULD be something of a challenge!
But hey; how big is a 16-bit-per-channel, 24 Mpix 'raw' image file? Now inflate that to 64 bits per channel; then inflate it again, to include both focus-stacking data and shutter-speed stacking data, ALL at similar 64-bit-per-channel sampling levels?
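The back-of-envelope arithmetic is sobering. A quick sketch, taking the figures above at face value (24 Mpix, three channels, 10 kHz sampling for 30 seconds, 24 hypothetical aperture steps; these are the post's assumptions, not measurements of any real camera):

```python
MPIX = 24e6  # nominal 24-megapixel frame, per the text

def frame_bytes(bits_per_channel, channels=3):
    """Uncompressed size of one frame at a given bit depth."""
    return MPIX * channels * bits_per_channel / 8

base = frame_bytes(16)        # 144 MB: ballpark for one uncompressed 16-bit frame
fat = frame_bytes(64)         # 576 MB once inflated to 64 bits per channel
sub_exposures = 10_000 * 30   # 10 kHz sampling across a 30-second capture window
apertures = 24                # one full sweep of hypothetical aperture settings

total = fat * sub_exposures * apertures
print(f"{total / 1e15:.1f} PB per capture")  # roughly 4.1 petabytes
```

Petabytes per press of the shutter release, before any focus-stacking data is added; which is rather the point of the next paragraph.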
Computer electronics have come a heck of a long way in the last 25 years; but that sort of sledgehammer processing power isn't infinite. And remember, it's the very nature of digital sampling that is the 'problem' here, with threshold clipping 'condensing' the data at source; so resorting to even greater sampling compression, or clipping, to cope with the inflated bulk of data needed to achieve some sort of redundancy of point-of-capture exposure settings, is actually perverse to the objective.
So... the conclusion is: NO, I don't think that the 'term' exposure is obsolete, as it has come to be commonly used... it has always been a rather vague and ambiguous description of what you do at the point of capture in the image-making process, or a synonym for an 'image' itself, as well as an abbreviation for 'exposure settings' or 'exposure value'.
It has 'evolved' from simply meaning taking off the lens cap and 'exposing' the glass plate to image-forming light. Changes brought about by dint of 'digital' technology MAY see the term's usage evolve differently, depending on what that alternate technology begs of us at point of image capture, but it's unlikely that the term will be abandoned.
Meanwhile, I REALLY don't believe that digital systems have reached the point that they are genuinely both sensitive and accurate enough even to make ISO settings irrelevant at point of capture; let alone that they may become so refined, with sampling rates high enough to make shutter speeds, let alone aperture settings, irrelevant, that there is, even in the future, very much hope that we can rely on bulk data capture to cover any and all eventualities, and record enough data that we can make all our 'exposure' decisions in post-process, after the event, from some enormous, all-encompassing data-file covering all extremes of the dynamic range, the focus field and the shutter range; derived not just from an incredibly evolved super-sensor, but also from some pretty evolved super-high-frequency sampling schemes that can also sample all variables of both shutter speed and aperture... IN a fraction of a second!
Should we perhaps revise our interpretation of what IS an 'exposure'?
Well, an utterly different question, really; but back to basics, it IS simply the practice of taking off the lens cap and letting the cat see the rabbit; whether the cat is a glass plate, a bit of celluloid, or a silicon chip, and whatever the rabbit happens to be, anything from a landscape to a portrait to a macro image of an insect, and all manner of subjects in between or beyond...
It is already a term born of legacy, to denote the 'moment of image capture'; and in many cameras the 'sensor' has probably been 'exposed' to the scene for however long before you press the shutter release to prompt a data-capture of it, especially if you are using the preview screen and not the optical viewfinder!
Compounding the term to cover the exposure value, and the centring of that EV in an image, or, more, the actual exposure settings, IS merely accepting a confusion of terminology born of abbreviation, failing to sufficiently differentiate between the 'moment' of exposure, the 'exposure value' and the 'exposure settings'...
That does not require the term 'exposure' to be consigned to obsolescence, or abandoned, or for its definition to be changed; merely for people to exploit the fantastic quality of the English language to be explicit and accurate, and to use the term less ambiguously or errantly, whether by context or qualification, in order to indicate whether they are talking about the act of exposure, the exposure value, the exposure settings, or even an actual rendered image that was made by 'exposure'?!?
As to the 'technology'... how that has evolved over the last two hundred years, and how it is likely to evolve over the next two hundred or more, may influence the methods and practices of how we create photographic images; but that is a separate matter to how we employ language to describe it.
IF digital technology evolves to the degree that a uni-sensor becomes a viable reality, whether that may make ISO settings an irrelevance, and whether high-frequency sampling could make shutter and aperture settings a similar irrelevance, so that an 'image exposure' could be created entirely in post-process from an excess of image data covering all eventualities? Well, it could be done. I don't think that it will be, certainly not in the foreseeable future; but even if it were, I REALLY don't see people in general abandoning the term 'exposure'... or using it any less ambiguously than they do now!