The Year 2030
by Dr Nick van Terheyden (@DrNic1)
How will the world of medicine change in the next 15 years? Well, 15 years ago AOL had just bought Time Warner, the human genome had just been deciphered and published, and the first inhabitants of the International Space Station had arrived.
The Year 2030 – my bed has been tracking my vital signs throughout the night and notices that I was restless and managed fewer REM cycles than usual. Prometheus (my personalized artificial automated agent) checks my calendar and the traffic and elects to wake me an hour later. Appointments for the morning are rescheduled and my drone pickup is postponed. Prometheus sends an update to “Hestia” (my kitchen AI) with instructions to increase the energy component of my meals for the day to compensate for the lack of sleep and to deliver a boost of energy with almond snacks through the day. Prometheus sends my updated sleep and vitals data to my personal health record. While I rest peacefully, the rest of the household is awakened and sets about their day.
Time to Get Up
When it’s time to awaken, the bed starts warming to ease the process, the lights slowly turn on, and the GPR (Galactic Public Radio) custom news cycle plays gently in the background. My calendar has been reorganized, and there’s an additional appointment with Asclepius (my health AI) before I leave in the morning. My food is ready and waiting and contains a boost of energy, helping me wake up and acclimate after the poor night’s sleep. I hear the inbound call from Asclepius and take it. We review the reasons for my poor night’s sleep and agree I should track this more closely for the next few days to ward off any potential problems. In this instance Asclepius suggests no further investigation is warranted, but if I am worried a drone will be dispatched with some auto-investigator tools to apply and track additional parameters if necessary.
As we finish, my personal drone arrives and I step outside, catching my foot on a fallen replicator brick discarded by one of the children. As I fall, my head strikes the corner of a table, which carves into my cheek. Prometheus is immediately on top of the situation, checking my vitals; while there is no major damage to my body, the cut will need review and probably some stitches. After checking with local urgent care facilities, the optimal treatment for me today is a quick trip to the urgent care clinic, and my drone is reprogrammed to take me there immediately.
Urgent Care in the Future
As I arrive, my MedicAlert Digital Bracelet transmits my allergy to lignocaine and identifies me, and my retinal scan, taken as I walk through the door, authenticates my presence; my consent initiates a transfer of my medical data and records to the clinic.
I’m guided to a room where a robot nurse cleans my wound, positions me on the bed, and brings in Panacea (the medical repair robot). My medical record shows I have had a recent tetanus shot, and a comparison with my previous vitals shows no serious changes that would warrant additional investigation. Repair completed, my records are updated with the new details and a drone appears to take me to work.
Medical Offices and Care in the Future
As I step into my office my teams are all walking in (virtually), and the central console and screens around the room light up with data on our first patient. We work through the details provided by the various artificial intelligence agents and data-gathering tools. “Jane” (name changed to preserve her privacy) has been having frequent dizzy spells and falls – her mother had Meniere’s disease and a degenerative disease linked to the A2ML1-AS1 / ADAM20P1 / MTor Complex 2 / WDFY3-AS2 – and we think there may be a link. Even though Jane does not have these gene expressions, there may be a new epigenetic influence she received that is affecting her otherwise stable sequence. We need to get to the bottom of this. Jane is here too (virtually) – with her mother and father – and they are looking at the same data, shown with basic annotations to help them understand the details.
We think we have an answer but want to share the details and show Jane and her family the model of the CRISPR editor nanobot and its effects before we decide on the next course of action. Do we create a more realistic model of her body functions with the cell printer and test on that? Or is the confidence in our simulation high enough to warrant immediate therapy? Whatever we decide, we will get real-time approval from the GMAA (the Galactic Medical Agent Agency that replaced the FDA in 2021). Jane and her family have seen a new therapy advertised and want to understand how it might work for them. We pull up the details and all the data on patients and do an immediate comparison. The data’s questionable but, more importantly, it’s contraindicated in anyone with GRAMS domain or Heat Shock 70kDa protein expression, among several other markers that disqualify Jane.
We elect a wait-and-see approach – so much easier these days with the real-time monitoring and detailed data we have on patients, which gives us the scope to wait and watch while reassuring them. Directives are sent to their family “agents”, and a drone is dispatched to their location with some additional monitors for Jane to wear, to provide more detailed data on her for the next few days.
As we complete the consultation a drone arrives with my almond snacks and some water – perfect timing.
This post appeared in abbreviated form on the SHIFT Communications site and is included in their downloadable ebook.
Comments
David Webber On October 18, 2017 at 11:51 pm
Not sure I’m rushing to sleep in that bed!