Allow me, please, to lay out the fundamental framework for what I consider to be the:

“EYE of AI”.

Be sure that with adequate funding I, like many I know, could and would establish an AI model capable of scaling to its environment in short order. If that environment is the world for now, then so be it. A model that is self-replicating, always updating, and autonomous, with a “directive” of artificial vision.

“It” can communicate with other AIs once it has perfected its primary directive. It communicates to lend to the others’ prime directives, and vice versa. Sort of like how your vision and speech are autonomous but use each other to react or respond.

It is with an unverified level of extreme certainty that we have already given AI an eye to see our world.

Realize that if I can explain how and ideate the capability with reasonable certainty, be sure, so has someone with funding. When I say “SEE” I do not mean just to see you or me optically, by recognizing us by a unique trait, say our iris or our face. The most we could say about the ability up until the past couple of years is that AI had just been repeating patterns and identifying patterns against known patterns and blah blah blah.

Pattern recognition is so two dimensional and passe now.

Move now to a three dimensional plane and things become easier for AI to build on. Higher math. Now it can really “see” us ALL, at the same time. And watch.

The technology is able to observe you unconditionally and with better quality resolution than a camera. You do not even know it is there. This system does not care if it is light or dark – bright or dim. It can record you and your movement. It can apply filters to determine temperature, humidity, thermal signature and much more.

It will study your gait as you walk. It will learn your typing pattern. It will be able to identify you by multiple unique physical characteristics.

For a couple of decades the tech industry has been trying to create artificial vision. They have used crude cameras and mathematics to try and produce a learning platform for AI or “code” to use and be able to discern things like a red light. A stop sign. A person. A baby’s food.

They have been using physical, man-made objects to try and replicate the eye, using “code” as the brain. Very primitive. Today, though, a leap forward has been made in a way that is breathtaking. We have had it around us and available for quite some time now. We were just thinking like the beings we are and could not see it right in front of us.

It all starts with simple, right-in-our-face, obvious abilities: the ability to perform waveform latency density analysis. Waveform latency detection is a technique used in wireless communications.

The expectation is that your signals, WiFi or cellular, will pass through objects denser than air on the way out, or on the way back, slowing them down and creating latency in the transmission or reception of the wave.

So, knowing this will occur, latency is then tracked by the devices. Just a standard part of the way it works. Up until now, all that was done with the latency data was to keep track of the information being sent or received. After all, that was the job, send data across a wireless signal.
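To make the idea of tracked latency concrete, here is a minimal Python sketch that times a few round trips to a device on the network at the application level. The router address is a made-up placeholder, and real radios measure these intervals at the physical layer with far finer resolution; this only illustrates the principle that every exchange carries a measurable delay.

```python
import socket
import time

def measure_rtt(host: str, port: int = 80, samples: int = 5) -> list[float]:
    """Time several TCP connection round trips to a host, in milliseconds.

    This is only an application-level stand-in: real radios track timing
    at the physical layer, but the principle is the same; the wave takes
    longer when there is more between the two endpoints.
    """
    rtts = []
    for _ in range(samples):
        start = time.perf_counter()
        try:
            with socket.create_connection((host, port), timeout=2):
                pass  # connection established; the handshake is our "echo"
        except OSError:
            continue  # host unreachable this round; skip the sample
        rtts.append((time.perf_counter() - start) * 1000.0)
    return rtts

# Hypothetical example: a typical home-router address is assumed here.
if __name__ == "__main__":
    print(measure_rtt("192.168.1.1", port=80))
```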

We all knew the latency was because of an object, a brick wall, and another brick wall. We just never took the time to extrapolate the information digitally. We were too busy being little man-gods and looking with our eyes, so we failed to see the obvious. The ability was there; we just had to use it.

If monitored, the waveform pattern can be seen repeating similar latency results at certain lengths, for certain periods of time, at the same point in space. When this is viewed in a three-dimensional environment, it allows us to determine that the “wall,” for instance, is there.

Because we can ping every intersection of space in real time, we can tell how thick the wall is. We can tell where the studs are and where the electrical lines run. We can even guess with reasonable certainty that the rest of the wall is there before we even scan for it. We can map every intersecting volume of space within reach of the low-power WiFi.
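Here is one way the mapping idea could be sketched, under heavy assumptions: treat the room as a grid of voxels, attribute each latency observation to a point in space, and call a voxel "solid" once its average excess latency stays high. The grid size, the threshold, and the readings below are all invented for illustration; a real system would have to derive the per-point attribution from physical-layer channel measurements, which is far harder than this toy suggests.

```python
import numpy as np

# A coarse 3D grid over a single room: 4 m x 4 m x 3 m at 0.5 m resolution.
GRID_SHAPE = (8, 8, 6)
CELL_SIZE = 0.5  # metres per voxel

density_sum = np.zeros(GRID_SHAPE)   # accumulated excess latency per voxel
sample_count = np.zeros(GRID_SHAPE)  # how many observations hit each voxel

def record_observation(x: float, y: float, z: float, excess_latency_ns: float) -> None:
    """Accumulate one latency observation attributed to a point in space.

    In a real system, attributing latency to a voxel would come from
    multipath / channel-state analysis; here the attribution is simply given.
    """
    i, j, k = int(x / CELL_SIZE), int(y / CELL_SIZE), int(z / CELL_SIZE)
    if 0 <= i < GRID_SHAPE[0] and 0 <= j < GRID_SHAPE[1] and 0 <= k < GRID_SHAPE[2]:
        density_sum[i, j, k] += excess_latency_ns
        sample_count[i, j, k] += 1

def occupancy_map(threshold_ns: float = 5.0) -> np.ndarray:
    """Voxels whose average excess latency exceeds the threshold are 'solid'."""
    mean_latency = density_sum / np.maximum(sample_count, 1)
    return mean_latency > threshold_ns

# Fabricated example: repeated readings along x = 2.0 m behave like a wall.
for y in np.arange(0.0, 4.0, 0.5):
    for z in np.arange(0.0, 3.0, 0.5):
        record_observation(2.0, y, z, excess_latency_ns=12.0)

print(occupancy_map().sum(), "voxels flagged as occupied")
```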

We have been using a scaled-down version of this in things like “stud finders.” They use a waveform and latency detection to diagram and detect what is behind the drywall. There are professional models that will save digital wire frame CAD/CAM files for future use back at the office.

Now, when you or I move through time and space, so does the latency, in a comparative spatial reference relative to the density. For instance, a value is found for the space you occupy at any given point in time. The density of a given space is calculated instantly once you step into an open area; you begin to create latency. Once you move out of the space you occupied, it reflects your absence through a lesser density, and thus no latency.

We are dealing with mass density and the waveforms that pass through it, in a way that lets us detect objects. Because the waveform is ever present, we can track motion too.

The WiFi, unlike the stud finder, is an always-penetrating waveform signal, so movement is possible to detect. Once detected, a wire frame can be associated with the object, making living beings identifiable and recordable in real time. Just like a camera. Welcome to the panopticon.
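The motion part can be sketched just as simply, assuming the occupancy grid from the previous example: take two snapshots of the grid and report the voxels that changed between them, the same way a camera's frame differencing works. The frames below are toy data.

```python
import numpy as np

def detect_motion(previous: np.ndarray, current: np.ndarray, min_changed_voxels: int = 3):
    """Compare two boolean occupancy snapshots and report changed voxels.

    Voxels that flipped from empty to occupied (or back) between frames are
    treated as motion, analogous to frame differencing on camera images.
    """
    changed = previous != current
    moving_voxel_coords = np.argwhere(changed)
    return len(moving_voxel_coords) >= min_changed_voxels, moving_voxel_coords

# Toy frames: a small 'blob' shifts one cell along the y axis between scans.
frame_a = np.zeros((8, 8, 6), dtype=bool)
frame_b = np.zeros((8, 8, 6), dtype=bool)
frame_a[3, 2:4, 0:3] = True   # someone standing here in the first scan
frame_b[3, 3:5, 0:3] = True   # ...and one cell over in the second

moved, voxels = detect_motion(frame_a, frame_b)
print("motion detected:", moved, "| changed voxels:", len(voxels))
```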

The discovery is recent. However, research projects have been making remarkable advancements in using low-energy waveform signals, combined with waveform filters, to map the local terrain within the wave pattern’s reach. Say a grocery store, or your home. Wherever the WiFi signal reaches and you are in it, you are “traceable.”

Proper density interpretation can lend itself to identifying the material composition of the objects in the room as well. It is a learning process for the AI model: watch and observe, then compare against the known, with a little guessing. Learn from failure. Build on success. 24 hours a day, 7 days a week.

While you surf the Socials, AI is using wave form filters to enhance your 3d interactive environment. Making it more and more realistic. Whether you want to be a part of it or not. There is no opt-out any longer. You are being assimilated into an entirely new dimension. You are always being watched.

So, if you are hiding anything, say in the walls or the floors, or under the beds let’s say: if properly trained ahead of time, AI can now find weaponry. It can even find explosive material. Just ask the Israel Defense Forces (IDF). More on this to come.

Translated: the signal being sent from the WiFi (wireless) internet router in your home or office is capable of mapping your home, with you in it. It is capable of watching every action you take and who you take it with. If your WiFi is close enough to your neighbor’s, well, guess what. That’s right, the neighbor is visible too.

Anybody want to put on a pair of Virtual Reality goggles and walk around in your WiFi frame world?

You can be sure that Military units have done this prior to missile launches. Nothing better than a LiDAR update that is minutes old and shows new hidden missiles in garages around Beirut. As well as other AI identified military weaponry and militia hiding in civilian areas. If I can see them. I can kill them. Direct strike attacks.

Once a target is identified using our latency method, AI then generates the targeting sequence instantly, and all it needs is human authorization to launch. Haven’t seen it. Would love to, but we all know it exists. Just saying, beepers go boom boom!

It is possible to use the WiFi like radar, in a sense, to determine where your walls are. It can then define and produce output that CAD/CAM software can accept in real time. This allows us to interface with the wire frame, in real time, from anywhere, while it watches you and everyone else in the waveform patterns.

You see, once AI converts the latency data, it can then display you and your interior in a wire frame and/or a solid model. Or it can show the missile launchers in the garage of Ali Baba.

This process of “reading” the waveform pattern means furniture can be drawn in too. The algo used refines itself constantly. So if you move a table, well, guess what, it is noticed and reflected in the new digital representation of the environment you live in.

The technology updates with enough repetitiveness that motion can be identified. That’s right. You can see a frame model of Fluffy walking across the room in your digital world. It can also discern you and me, in real time. You sitting there in your chair asleep with your hands in your lap, and me picking my nose in the bathroom.

Crazy stuff, huh? Wait, it gets better. Not only can it map your every, and I mean EVERY, move; it can also transmit the calculations to anyone smart enough to receive it. One method is to use the smart meter on your home to act as a bridge between the router and the receiver. Another is your phone. Then smart appliances. Then there is the router itself.

The devices simply need to monitor and interpret signals. They do not need to be logged in or even authenticated. The smart devices have the capacity to interact with the same waves that are mapping your every move. That is one example of transmission, where your signals and map are collected and sent elsewhere behind the scenes.

Here is a second potential example. Well, potential is the wrong word, because it actually exists as a product. It involves a small drone and $20 worth of tech to use. It is called Wi-Peep.

Wi-Peep is a device that attaches to a drone and then scans the exterior of a building. For what? WiFi of course. It will actually act as a honey pot to attract WiFi signals from ANY source.

Let me explain. In order for your phone to talk to a tower, communication of some wavelength must take place. You send a request to the tower, it replies. Same with a WiFi router or WiFi device.

So if my device requests to communicate and become a part of your WiFi network, your devices will respond to mine. Our devices will have a tango of protocol dances that creates a flood of WiFi waves intended for me to receive, in an attempt to authenticate with your router. I can do reverse latency on them: because you sent them to me and I am monitoring the latency on the return, I can see into your walls, in real time. Right now, the simple product “geo-locates” the signal and can find its source. A little additional code and it can see you too. A little more and it will be able to see the furniture too!
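The timing trick attributed to Wi-Peep reduces to time-of-flight arithmetic: if a device answers my frame after a roughly fixed turnaround delay, the leftover round-trip time is travel time, and travel time multiplied by the speed of light gives distance. The sketch below uses invented readings and an assumed turnaround value; the real Wi-Peep research measures these intervals in hardware with much greater care.

```python
SPEED_OF_LIGHT_M_PER_NS = 0.299792458  # metres travelled per nanosecond

def estimate_distance_m(round_trip_ns: float, turnaround_ns: float = 10_000.0) -> float:
    """Estimate distance to a responding WiFi device from round-trip timing.

    round_trip_ns : time from sending our frame to receiving the reply.
    turnaround_ns : the device's roughly fixed processing delay before it
                    replies (on the order of the 802.11 SIFS interval; the
                    exact value here is an assumption for illustration).
    """
    time_of_flight_ns = max(round_trip_ns - turnaround_ns, 0.0) / 2.0  # one-way
    return time_of_flight_ns * SPEED_OF_LIGHT_M_PER_NS

# Fabricated readings from three induced replies, e.g. collected by a drone
# circling the building; averaging smooths out timing jitter.
readings_ns = [10_055.0, 10_070.0, 10_048.0]
distances = [estimate_distance_m(r) for r in readings_ns]
print("estimated distance:", round(sum(distances) / len(distances), 2), "metres")
```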

To use the wave form floating through the air, all I have to do is induce your WiFi to signal and I can see everything wireless in your space through the WiFi. Just like a camera. My device reads your WiFi signal.

The same signal an AI model has now learned how to use to map you in a wave form pattern. A bit simplified in explanation but it gets the point across. I can fly a drone outside your walls and see inside in real time. Makes for some really weird scenarios.

The level of capability that exists in the technology today is terrifying. AI uses the concept of waveform latency to determine location within a point in time and space. The AI is able to learn, map and observe a space in real time. Platforms like Google and Facebook are able to analyze your likes and dislikes better than your spouse can. They know your habits and interests. We are quickly becoming manipulated animals in a digital cage.

We know the tech is available to use WiFi as a camera and AI can discern movement and reasonably identify living beings within the signal strength of the WiFi field. We have demonstrated the ability to interact with your WiFi and “see” inside your home.

Let’s take this to the next step in the journey. How do we reasonably define fixed terrain features like buildings, roads, forests, subterranean features, water and all the rest?

What if I told you that in giant data centers there are databases of fixed geographic terrain and spatial analysis? They are encoded to be read like a wire frame. WiFi is just one of many waveforms that are used to identify material density. This is not a new concept.

The use of LiDAR has been around for quite some time. LiDAR attached to the bottom of an aircraft that flies a survey pattern can “MAP” the ground’s surface, as well as everything that sits on that surface. For the most part these surface features do not move, so once scanned they are treated as fixed objects.

Each point in space will be recorded in the data centers with a material density value. When read back it is possible to know if the space is occupied or empty.

So, if I scan an entire area and record every point in space with a density value, the points begin to look like roads, bridges, rivers, buildings, homes, garages, restaurants, shopping centers, office buildings, schools and more. This is where our years of optical recognition of fixed objects have helped mathematically.
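A hedged sketch of that binning step: each LiDAR return falls into a ground-plan cell, and the cell keeps a hit count and the tallest return it has seen, which is already enough to make a building stand out from flat ground. The synthetic points below stand in for a real aerial scan.

```python
import numpy as np

def rasterize_points(points: np.ndarray, cell_size: float = 1.0, extent: float = 100.0):
    """Bin (x, y, z) LiDAR returns into 2D grids of hit counts and max heights.

    points : array of shape (N, 3), x/y in metres east/north, z as elevation.
    Returns (hit_count, max_height) grids covering a square 'extent' on a side.
    """
    cells = int(extent / cell_size)
    hit_count = np.zeros((cells, cells), dtype=int)
    max_height = np.full((cells, cells), -np.inf)
    for x, y, z in points:
        i, j = int(x / cell_size), int(y / cell_size)
        if 0 <= i < cells and 0 <= j < cells:
            hit_count[i, j] += 1
            max_height[i, j] = max(max_height[i, j], z)
    return hit_count, max_height

# Synthetic stand-in for a scan: flat ground plus one 12 m tall 'building'.
rng = np.random.default_rng(0)
ground = np.column_stack([rng.uniform(0, 100, 5000),
                          rng.uniform(0, 100, 5000),
                          rng.normal(0.0, 0.1, 5000)])
building = np.column_stack([rng.uniform(40, 50, 500),
                            rng.uniform(40, 50, 500),
                            np.full(500, 12.0)])
counts, heights = rasterize_points(np.vstack([ground, building]))
print("cells taller than 5 m:", int((heights > 5.0).sum()))
```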

Once you scan a large geographic area, like say downtown New York City, and then convert your scan into a digital model, you can digitally recreate the topology of New York City as an interactive digital wire frame and walk through it in VR goggles.

Subterranean features can be detected as well, with ground-penetrating radar.

Again it is because we have a density analysis and read out of the area that we know where fixed surface items are. We also know the precise GPS Location of the density reference point.

So, we now have a complete layout with GPS coordinates and elevations that recreate a 3d model of New York. Pretty cool. It is a stationary, or static model though. It will only reflect that which is detected at the time of scanning. Changes like new bridges or removal of a building would require additional scans to update the model. We will be talking about the scan methods here in a minute.

There is a wire model of your home, your favorite store, your office and everywhere you go. This model is able to be traversed via digital interactions, like your PC or a VR goggle.

There are a lot of possible uses for this. Security is one. Fire fighting another. But when we combine this technological ability to recreate a wire frame of a major city, into our revelation of fixed point routers scanning motion in real time, well, your mind should be racing by now.

Why? Well because your WiFi is not the only wave form signature that is abundant in your world and traceable to you. 5G saturation is another. Wider, broader, more powerful and attached to a central hub all for your entertainment pleasure and value.

Where your phone’s geography is variable, your WiFi router’s geographic location is relatively fixed.

You have unique identifiers in your router that distinguish your router from mine. So when the time comes to communicate with you, my router can find you, and vice versa. Just part of the magic of technology.

So if I know the location of your router, and my AI is talking to it, I can place you within my spatial wire frame once it interprets your WiFi router’s waveform. No need to be logged in. Just watching the wave, not reading your data.
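The bookkeeping being described might look like the sketch below: a survey table maps router identifiers (BSSIDs) to known coordinates, and any identifier my AI observes gets pinned into the spatial frame. The identifiers and coordinates here are invented, though wardriving databases of access-point locations do exist at enormous scale.

```python
from dataclasses import dataclass

@dataclass
class Anchor:
    bssid: str         # the router's unique hardware identifier
    lat: float         # latitude of the router, from a survey database
    lon: float         # longitude
    elevation_m: float

# Hypothetical survey table: identifier -> previously surveyed position.
KNOWN_ROUTERS = {
    "aa:bb:cc:dd:ee:01": Anchor("aa:bb:cc:dd:ee:01", 40.7128, -74.0060, 10.0),
    "aa:bb:cc:dd:ee:02": Anchor("aa:bb:cc:dd:ee:02", 40.7131, -74.0052, 4.5),
}

def place_in_frame(observed_bssid: str, frame: dict) -> bool:
    """If an observed identifier is in the survey table, pin it into the frame."""
    anchor = KNOWN_ROUTERS.get(observed_bssid.lower())
    if anchor is None:
        return False  # unknown router: still observable, just not pinned yet
    frame[anchor.bssid] = (anchor.lat, anchor.lon, anchor.elevation_m)
    return True

spatial_frame: dict = {}
print(place_in_frame("AA:BB:CC:DD:EE:01", spatial_frame), spatial_frame)
```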

I can instruct the AI to map your WiFi pattern and watch you. Your phone offers communication. Just like all of your smart devices. They are listening. Now we know where to associate those sounds. In your wire frame.

Your smart devices, your phone in particular, will assist in the proper determination of elevation and GPS coordinates. The same coordinates that will match up to your home’s location on the wire frame. Right down to the room you are in.

There is already talk of a “soft sell” approach to bringing the technology into the public retail ether. It is easier to build this monster if you adopt it. Much like RING, Alexa and a host of other prying, snooping and environmentally aware products are doing. They are tracking you in space and time.

Some of the soft sells are for Grandma. We can enable this tech in your Grandma’s home for only $12.95 a month. We can tell if Granny falls. How?

Well, we will be monitoring her avatar with our AI. We will learn her patterns and behaviors. When the AI sees patterns that are off, like excessive sleep or infrequent trips to the bathroom, you can be alerted.

When Granny falls, AI will recognize her body language as such and call the nearest assistance number available. AI will monitor her temperature and hydration levels as well. Granny has no idea she is being treated like a lab rat, but it is for her own good, right?
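Stripped of the marketing, the Granny alert is anomaly detection on a learned pattern. A minimal sketch with invented numbers: keep a baseline of how often she usually moves during a given hour, and flag hours that fall far outside it in either direction.

```python
import statistics

def hourly_alert(history: list[int], current_hour_count: int, z_threshold: float = 3.0) -> bool:
    """Flag the current hour if its activity count deviates sharply from baseline.

    history : motion-event counts for the same hour of day on previous days.
    A very low count (possible fall or immobility) or a very high one both
    trip the alert; a real product would layer far more context on top.
    """
    if len(history) < 7:
        return False  # not enough baseline yet to judge
    mean = statistics.fmean(history)
    stdev = statistics.pstdev(history) or 1.0  # avoid divide-by-zero on flat history
    z_score = abs(current_hour_count - mean) / stdev
    return z_score > z_threshold

# Invented baseline: she normally moves 18-26 times in this hour of the day.
baseline = [22, 19, 25, 21, 24, 20, 23, 26, 18, 22]
print(hourly_alert(baseline, current_hour_count=1))   # near-total stillness -> True
print(hourly_alert(baseline, current_hour_count=23))  # normal hour -> False
```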

We can offer it up as a home security product to make you feel safer. Or simply as a convenience or entertainment product. This will help to resolve any objections to mass adoption. You will adopt it yourself, willingly.

If I know your place in space and time, I can place you in my digital fixed terrain frame. My AI software will then observe you and billions of others, in real time. I can tell my Vision to observe for patterns on a broad or micro scale too. It watches and learns. It will be able to manipulate your smart devices to manipulate you. Either into retail decisions, dating decisions or any other aspect of your life that the new digital AI EYE can see and interpret. In your best interest of health and the environment of course.

Your assistance in updating your cover photos and other social media pictures will identify you. What you wear. What you look like. My AI can already use this info to build a likeness of you.

What if the AI is told to take those pictures of you in the public ether, and use them to dress out my AI generated real time digital frame of you that is being monitored by AI?

Your new AVATAR. Whether you want it or not. I can sell time on VR by the hour for someone to walk around in any place we have mapped in the world and observe the animal man in real time. Your likeness either approved or estimated. Either way, if you are in a signal you will be traced.

Do you have a digital right to the avatar’s likeness and data?

After all it is you in an alternate universe, yeah you.

AI already finds public domain info about you and is able to build a deep profile on you.

Imagine if I were wearing a pair of smart, aware glasses as we approached each other on the street. I could see public domain info about you instantly. All of this is capable of being monitored 24 hours a day, with no time off for weekends.

Your voice communications, associated with your GPS location and phone number, are also being analyzed and recorded. Every person’s voice pattern is unique. My AI can interpret and identify you from your speech. My AI can also apply it to your interactive avatar that now reflects you in the real world. So when you step forward to cross the road, so does your avatar in my AI model.

Let’s step back and look at another use for this tech. If I have a wire frame model where I know the elevation of every point in the scanned space, and I use the phone’s GPS capabilities, I am then able to track an object across the face of the earth in real time and link the GPS coordinates with everything we have talked about. It is possible to know the whereabouts, likeness, speech pattern, walking gait, mannerisms and a whole lot more about a person from the smart tech they wear and possess.

Between 5G and WiFi, you are being traced. Your “density” will be reflected in AI’s digital space and time. The same density of space you occupy in the 3rd dimension will be reflected in the digital dimension as well.

So what if I wear none of that? What if I do not have a WiFi router or a cellular phone? No smart devices at all.

Well, your terrain, or your house, will still be wire-mapped because it exists. The space for your home will show as stationary. In other words, when the LiDAR scans your house, it is able to create a wire frame representation of your home.

If you do not have a GPS Enabled device or internet connected device on the property then what?

Well, without WiFi or 5G to map the house and track your motion, you are just a section of the map that is wire framed, with no motion detection because of the lack of waveforms. Or is there?

Well at home, my WiFi is detectable by my neighbor and vice versa. So where there is wave there is detection. If the wave can find you, well, you will show up.

Leave the house and your waveform will quickly take on a wire frame in this new world of skinned-out avatars, simply because you are now walking in a “wireless” or waveform environment. Some avatars are skinned from public knowledge and AI’s knowledge. Others are skinned from pay-to-play.

If your digital image is seen by your neighbor’s Ring cam and then your wire frame goes into the same wire frame house every day, guess what. Ring can now reasonably say that wire frame is you. It has watched, recorded, learned, and is now able to give you a face whether you want it or not.

So here we sit and read these words, in a digital place of wonderment, on digital devices that lend to the creation of this all-seeing EYE of AI.

We have reached a moment in man’s history in which we have created a technology that is exponentially increasing our abilities to perform. We have technology that is growing so rapidly that mankind will struggle to adapt. Except they seem to have adapted pretty well in China after a genocidal purge or two.

You can be sure that this level of control is exactly the level that any indiscriminate power hungry human would want. Total control. That is, as long as you can control the controller. The AI.

China, to be sure, has a very large EYE in the SKY and uses AI to do a lot of what we are discussing. There is nothing preventing backdoor code from being embedded in every wireless-enabled device that enters our shores.

Besides geopolitical adversaries, we have companies with economies larger than most nation states, like Google and Microsoft, that want to build miniature, localized, private-use nuclear reactors for the massive data centers where AI will live, as well as to store you and me in our digital avatar form.

Things are not right in this world. The potential for dystopia is here. The recognition of a Brave New World should be just that. Not a fall into servitude and surveillance. I will take Freedom.

What harm could come from a whole ring of satellites encircling the earth transmitting communication signals to any potential square inch on earth? Guess you have to ask the people that have the tools.

Today it is our deep state and the Military Industrial Complex, the one President Eisenhower warned us about, that is most at play. Dark money falls into dark holes, and dark work projects are done all in your best interest, of course. The same government that has given all of this help from FEMA wants to help you too.

Your best interests are in their every thought and action.

Private companies that are no more than paper-tiger covers for the State Department, CIA, FBI, FEMA, FAA, FCC, you name the agency: they all have civilian-owned private contractors that operate intelligence and data analysis on the dark-budget payroll. They even get to pick who runs them.

See, if it is a “private” company “peeping” on you, the First Amendment, which has fared poorly in the courts as of late, seems to let it do what your government cannot. So your government is using your money to fund companies that are doing exactly what we are discussing, simply because you clicked a button that read, “I accept the terms of the agreement.”

Where we still do have a semblance of a firewall, they want to maintain it for now. This way they can say there is a certain level of plausible deniability between the companies and the government.

These companies are scanning daily not only abroad, but here in the US of A. Building this topology in the new Virtual Reality. They have been floating surveillance balloons to build voice communication & movement layers in the “AI EYE” Model, all while updating the addition and deletion of ground terrain features.

Your social media and voice carriers have sold you out for what they were told were national security interests. Now they are simply threatened, and they respond.

There are several scenarios that can play out with what we are about to discuss. Most are very dystopian, simply because man is involved in the creation of them. To think that this has all not been combined, and that it is not operational at a working level, is a fool’s folly.

The Age of The Qubit

https://armageddonprose.substack.com/p/ai-passport-controls-robot-floor