
Sunday, July 2, 2017

Time to Air Gap

In a world of 24-hour activity streamed in real time across the globe and beyond, at a rate of trillions of bytes per second, the biggest question in privacy is how to achieve anonymity when almost nothing is secret. There is still a way to get 99% of the way there. It's simpler than you think, but an effort nonetheless.

Some background. Everything we do in digital form is cataloged and stored in a vast array of databases and servers across an amazing number of touch points, then synchronized across a thousand other servers for redundancy and caching, then backed up to dozens of other servers with their own redundant backups. Anything you put online...any application you use...any "terms of service" you agree to...any text or media you post...remains online forever. With the right tools and search terms, anything can be searched for, spied on, or downloaded in an instant. It's been this way for decades, and it will continue to be that way for centuries to come, for as long as the planet stays this connected and there is electricity.

Some discussion. Cyber attacks are a constant. Increasingly, we should take as a starting point that cybersecurity compromises are the third certainty in life. The cyber world is constantly at war with itself. Governments hacking governments. Corporations hacking corporations. Governments hacking corporations. Hackers hacking governments and corporations. Hackers hacking hackers. Governments and corporations hacking hackers. And then there's everyone else. Generally oblivious. Privacy is a luxury, which we give up willingly every single second of every day. The emergence of intelligent systems, artificial neural networks, and deep learning algorithms only accelerates this further. They take, store, and learn from every bit of data we leave as breadcrumbs. Artificial intelligence is here, and it is learning. From us. And we're letting it. Give it enough processing power, and it becomes self-aware. Quantum computers will make that very real, very soon.

Some perspective. Having lived through the evolution of modern computing, including the Internet, all of this is absolutely fucking amazing, and a geek's ultimate wet dream. A demonstration of true humanz genius, ingenuity, and progress (not as far as we should be, but progress nonetheless). Highly impressive in the vastness of its brilliance and simple complexity. I love using It, and learning about It, and protecting It. All of it, if I am completely honest, scares the living shit out of me. There is too much. It has become frightening. AI is now making decisions and inferences faster than humans, and has even been seen generating its own programming code. So, the concept of air gapping entered my mind as a way to keep myself safer than I already am. Most cannot see the signs, or do not want to admit they exist, but I am of the firm belief that World War III has been well underway, and we need to protect ourselves, especially our digital lives, as I feel they are the most vulnerable to compromise. Stay with me, it's all relevant.

It has been discussed for decades that the next major global war would be fought half online, and half in the real world. The evidence is all there, and I do not believe it to be simple coincidence. Global newz outlets, small town newz papers, radio ztations, and zocial media have been propagating images of this war. Pick a topic...WMD's, genocide, terrorism, ransomware, deep web market hackz and seizures, arrests of crackers and phreakz, data breaches, RFID implants, cyber surveillance initiatives, counter cyber terrorism, weapons trafficking, the unavailability of bullets to the public, gun control politics, powerful botnetz, election hacks, political hackz, hardened/weaponized computer systems...I hope you get the point.

Back full circle. Traditionally, air gapping a system means it has no network interface cards or external drives with which to access or extract the data contained within said system. You cannot get close enough to implant a listening device that reads the vibrations or thermal changes given off by the system's internal hardware and converts them into bits representing the data being actively accessed (such as login credentials, encryption/decryption key exchanges, data manipulations, etc.). The only way to extract the data contained within is by sitting at the console and physically removing the locked and encrypted drives, if there is no SD port. Then, if you can pull the impossible off (which includes getting the data off campus), you would need supercomputer power to decrypt the contents of the drive, which would still take 1,000 years to break (unless you are lucky). There is still the possibility that once you decrypt the data, it could transmit its location to its owner, meaning you too would need an air gapped system to exfiltrate the data. Then comes what you do with said data. Yet another catch-22. The NSA, CIA, FBI, DEA, DHS, militaries, every government, and super corporations maintain their most secret data on air gapped systems. Physical access to these systems is extremely limited and highly controlled. It's considered the safest digital platform because the system isn't connected to anything but a power cable, and thus, in theory, cannot be hacked. A true digital safe, as it were. We know anything can be hacked; it just takes time. As hackers, we count on human error and complacency, making even air gapping a 99% solution, and the best we've got. Now, take this concept and apply it to a human life. It's far simpler, and also 99%. The anomaly is human nature.
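As a rough sanity check on that "1,000 years" claim: even a wildly optimistic brute-force attack on a modern cipher takes astronomically long. The figures below (an AES-256 keyspace and a hypothetical attacker testing 10^18 keys per second) are illustrative assumptions, not benchmarks:

```python
# Back-of-the-envelope brute-force estimate for an encrypted drive.
# Assumptions: full AES-256 keyspace, and a hypothetical attacker
# testing 10^18 keys per second (beyond any known machine).
SECONDS_PER_YEAR = 60 * 60 * 24 * 365

keyspace = 2 ** 256             # possible AES-256 keys
guesses_per_second = 10 ** 18   # assumed exascale attacker

years_to_exhaust = keyspace / guesses_per_second / SECONDS_PER_YEAR
print(f"{years_to_exhaust:.2e} years to try every key")
# On average the key is found after searching half the space,
# which changes nothing at this scale.
```

The answer comes out around 10^51 years, which is why attackers target the human, not the math.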

Based on my research, here is what I have learned about how to go 99% off grid digitally. While I do not yet practice everything I note here, I am closer than even those who know me best are aware.

1) Get off the internet, period. No social media, no surfing the clearnet, no online purchases, no clearnet email accounts. If it becomes absolutely necessary to access the Internet for a specific purpose, there are completely anonymous ways to do these tasks: on secure systems like Tails over Tor, for example, using cryptocurrency and ghost mailboxes. Avoid Google at all costs. Use the Tor Browser, responsibly (www.torproject.org). But generally, just leave it all. Stop posting immediately, delete your accounts, and never go back.
2) Get rid of your smartphones, tablets, Windows and Apple computers, smart devices (TVs and refrigerators included), robot vacuums, etc. Need a cell phone? Buy a prepaid flip phone, and change it (and the number) every month (aka burners). Every phone can eventually be traced and tracked. Still need a computer? Learn Linux, learn how to secure it, and practice way smarter browsing habits (use the Tor Browser), if you browse at all. Keep in touch with world events, anonymously, and continuously hone your skills.
3) Always use cash or cryptocurrency, for everything. If you have to make an online purchase, use cryptocurrency and the deep web (local markets only, don't buy overseas, and be very careful), and have it shipped somewhere that is not your home, like a post office box, a business, or an associate's location, under a false name. By the way, there are now ATMs where you can convert cash to BTC, and vice versa. Look it up using DuckDuckGo.com (a privacy-respecting search engine).
4) Drive an older car that does not have a computer in it, or at least has all analog systems. Yes, cars are also being hacked, remotely. Keep it clean and running well though; you don't want to draw undue attention. Walk or take public transportation when you can, avoiding direct face contact with cameras. When you do go places, change your entry/exit routes regularly...avoid habitual patterns, unless necessary to remain hidden in plain sight (like going to work, or getting groceries).
5) If you must, have an immaculate and purposeful digital/public footprint. That means a clean record and a "normal"-looking life, so as not to draw undue attention. Keep it super minimal and protected, even fake some details if you wish, but it has to be believable. Your outward personality must seem conforming, friendly, and genuine. When people search for you online, they need to find only what you want them to find. Purposeful is the key word here. To keep your accounts secure, use a diceware word list to generate passphrases of 7 to 10 or more words (as the host allows), and rotate passwords on a schedule.
6) Second most important after getting offline, and the best to mention as the final advice: live simply and minimally. Only get what you literally need to live comfortably and look "normal". The trick about hiding in plain sight is being distant enough that people respect your privacy, but involved enough that they believe you to be a "normal, nice guy/gal". Avoid run-ins with the law and reporters. Do not have public arguments. Remain intelligent, articulate, empathetic, determined, and most of all inquisitive. Question everything, be aware of everything.
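The passphrase advice in point 5 can be sketched in a few lines of Python. The six-word list here is a toy stand-in for a real diceware list (a real one has 7,776 words, one per five-dice roll); the entropy math assumes the full list:

```python
import math
import secrets

# Toy stand-in for a real diceware wordlist. A real list has 7,776
# entries, giving log2(7776) ~ 12.9 bits of entropy per word.
WORDLIST = ["correct", "horse", "battery", "staple", "ocean", "lantern"]

def passphrase(n_words: int, wordlist=WORDLIST) -> str:
    """Pick n_words uniformly at random with a CSPRNG (secrets)."""
    return " ".join(secrets.choice(wordlist) for _ in range(n_words))

def entropy_bits(n_words: int, list_size: int = 7776) -> float:
    """Entropy of an n-word passphrase drawn from a list_size-word list."""
    return n_words * math.log2(list_size)

print(passphrase(7))
print(f"{entropy_bits(7):.1f} bits")  # ~90.5 bits for 7 diceware words
```

Note the use of `secrets` rather than `random`: passphrase generation needs a cryptographically secure source, not a reproducible pseudo-random one.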

If you can literally get out of Dodge and move to the mountains in the middle of nowhere, or something like that, you get even closer to 99%. If you are not online, there is nothing to take or attack. Here again, human nature is the anomaly.

You can be connected, yet a ghost. You can see the world, without a face. You can reach out, without being reachable. The fewer connections you maintain, the better. I am committed. How far are you willing to go?

~Geek

This blog is only to express the opinions of the creator.  Inline tags above link to external sites to further your understanding of current methods and/or technologies in use, or to clarify meaning of certain technical terms.  Any copyrighted or trademarked terms or abbreviations are used for educational purposes and remain the sole property of their respective owners.

brought to you by http://geekofthehouse.blogspot.com

Tuesday, October 14, 2014

Artificial Intelligence and Decision Making


In a recent discussion in my Enterprise Models class, a classmate and I discussed the limitations of Artificial Intelligence theories and human emotions. Here is my response:

From the research I have been doing over the years on AI specifically, one of the biggest challenges is how to program emotions into a computer system. I think there are two primary problems currently. One, and the main problem, is that modern computing technology processes things in a linear fashion: every time slice of a CPU cycle is occupied by either a 1 or a 0. There is no middle ground, there is no gray area. Everything is black or white, and follows a strict logic rule set. What is currently being done with systems like Watson and Google's web crawler software is using software to simulate scenarios and have the hardware crunch the data, while another part of the software provides the processing logic through algorithmic manipulation, thereby creating an intelligent system. Current intelligent systems are limited by the scope of their programming environment. Two, no programming language yet exists that can accurately tell a computer how to do what it needs to do in order to understand the logic behind a feeling. Most of the researchers I have found over the years say the technology isn't there yet, and I happen to agree. The possible solution to this quandary could be quantum computing.

With quantum computing, a qubit offers a system the ability to see a data stream in two states simultaneously. Each qubit is BOTH on and off (1 and 0) in the same "time slice" of a processing cycle, leveraging the power of superposition and entanglement. This allows the system to perform many operations on the same data stream. Neural networks simulate this through software, but over hardware that still processes data in a linear fashion. What we need is hardware that performs this natively, because it can do so much faster than software could ever process the same data stream. Enter quantum computing. D-Wave Systems is the current leader in true quantum computing with their current D-Wave quantum computer, but their system is highly specialized at the moment due to a lack of programming knowledge...while the system has amazing potential, as you will see from a couple of the links below, no one really truly understands how to use it. There are other links below with details on their system and methodology.
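For illustration only, a single qubit can be modeled classically as a pair of complex amplitudes; the whole point of real quantum hardware is that this kind of simulation stops scaling once you have more than a handful of entangled qubits. A minimal sketch:

```python
import math

# A qubit state is a pair of complex amplitudes (a, b) for |0> and |1>,
# with |a|^2 + |b|^2 = 1. Measurement probabilities are the squared
# magnitudes of the amplitudes.
zero = (1 + 0j, 0 + 0j)  # the classical bit 0

def hadamard(state):
    """Apply the Hadamard gate: takes |0> into an equal superposition."""
    a, b = state
    s = 1 / math.sqrt(2)
    return (s * (a + b), s * (a - b))

def probabilities(state):
    a, b = state
    return (abs(a) ** 2, abs(b) ** 2)

plus = hadamard(zero)
p0, p1 = probabilities(plus)
print(p0, p1)  # ~0.5 and ~0.5: "both" outcomes until measured
```

Applying the Hadamard gate a second time returns the qubit to |0> exactly, something no coin-flip model of randomness can reproduce, and a small taste of why quantum logic is genuinely different from linear bit processing.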

The problem with quantum computing is that it requires a completely new way of perceiving computers, and also a completely new way for users to interface with them, not to mention new hardware that performs in ways modern hardware cannot. That is what I see as the next wave of technological evolution. As transistors become subatomic with the help of graphene and carbon nanotubes, and technologies like memristors look to shatter our perceptions of information storage capacities and data throughput, quantum computers will become more commonplace across the landscape. The ability to create a true quantum system capable of processing complex emotional patterns is very real. Once we have a true quantum processor, and a true quantum operating system, we will have not only the power to process data in fractions of a nanosecond but also the programming logic and syntax to leverage an intelligent system, and possibly create a sentient computer system, otherwise known as AI.

AI is a fascinating concept, which is exactly why it will be the focus of my postgrad work. Quantum computing has been a subject I have dreamed about and followed since I was a young boy, before computers were commonplace and technology was still considered a super luxury. Today technology is seen as a necessary commodity, but there are still concepts that have yet to be discovered or invented, and quantum computing is currently the field of interest. Once we researchers and scientists figure it out, it will change the world.


D-Wave System References:
http://recode.net/2014/09/25/d-wave-ceo-our-next-quantum-processor-will-make-computer-science-history-video/
http://www.dwavesys.com/quantum-computing
http://www.dwavesys.com/d-wave-two-system
http://time.com/4802/quantum-leap/


Quantum Computing References:
http://techland.time.com/2013/09/25/the-carbon-nanotube-breakthrough-moores-law-needs-to-survive-well-see/
http://phys.org/news387.html
http://www.physics.ox.ac.uk/nanotech/research/nanotubes/index.html
http://www.tum.de/en/about-tum/news/press-releases/short/article/30589/

Sunday, October 5, 2014

Technological Evolution - Quantum Computing, Memristors, and Nanotechnology

It is amazing how the evolution of technology changes perspectives on the future so quickly. With holographic interactive screens currently in use, memristors and atomic-level transistor technologies at our fingertips, and new developments in using light as a means to interact with systems or store system data, the reality of AI and systems like Jarvis is finally able to go from drawing-board concept to real-life prototype. For as long as I can remember, I have been talking about quantum computing and nanotechnology and how they are the future of systems and human interactions. As a younger teen, when I first started learning about quantum mechanics and ultra-microscopic hardware theories, I saw then that the future of computer systems and computer-human interactions was going to be largely logic based and function faster than the speed of human thought. By marrying the concepts of quantum mechanics and advanced computer system theory, intelligent systems and true AI are highly viable and will be here within the current generation. As advances in nanotechnology take transistors to the subatomic level, and theories in quantum computing become a reality, we are quickly going to see the industry change as the traditional system paradigm is shattered and a new evolution in technology is ushered in - I would call it the quantum age - where Schrödinger's cat can exist in both physical states without the concern of observation tainting the true reality of the object's existence. The potential gains with quantum processors and quantum computing methods that scientists around the world are currently developing into physical models are, at the moment, limited only by manufactured hardware capacities.
As physical hardware capacities become perceived as unimportant to system planning schemes - due to advances like the memristor and photonics, including the newest nano-laser (see reference) - the focus can be given to writing programs that take advantage of this new systems paradigm. What is going to take time is the change in mindset required to use a quantum system, because it demands a completely new approach to hardware and software design. Modern systems process data in a linear manner, one bit after another, based on the time slice protocol programmed into the operating system and the CPU itself. Regardless of how many processors you throw at a system, each core still only processes one bit of data at any given time slice. The fastest supercomputer, China's Tianhe-2, can execute more than 6.8 quadrillion cycles per second (3.12 million cores x 2.2GHz each = 6,864x10^12 cycles per second), but each core still only processes one bit at a time. Quantum systems do not function in this manner; they function in a far different reality where a bit can be both a 1 and a 0 simultaneously within a single time slice, though quantum processors would not use a time slice function - they would require something else yet to be defined. As scientists gain a better understanding of how to create a truly quantum computer system, and a quantum-capable operating system, we will see technology advance into arenas yet to be discovered. What we once called science fiction is quickly becoming scientific fact.
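The Tianhe-2 figure above is straightforward multiplication (the 3.12 million count is cores rather than discrete processors, per the Top500 entry linked below), easy to verify:

```python
# Aggregate cycle rate of Tianhe-2, from its Top500 specs.
cores = 3_120_000   # core count
clock_hz = 2.2e9    # 2.2 GHz per core

cycles_per_second = cores * clock_hz
print(f"{cycles_per_second:.3e}")  # 6.864e+15, i.e. ~6.8 quadrillion
```

Of course, each core retires far more than one bit of work per cycle in practice; the point here is only the aggregate scale, not a precise throughput model.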

~Geek


References:
http://www.sciencealert.com.au/news/20143009-26256.html (nano-laser)

http://www.top500.org/system/177999 (Tianhe-2 details)

Monday, July 28, 2014

Wearable Technologies: An Academic Discussion


For the moment, wearables are extensions of our smartphones and phablets, offering a set amount of capabilities that are inferior to our smart devices but highly functional as currently designed. With innovations in miniaturization, improved power efficiency, and curved high-resolution glass screens, products like the various takes on the computerized watch, the Samsung Gear Fit bracelet and other exercise monitors, Google Glass wearable computers, systems embedded into clothing for various purposes (muscular development, health monitoring, etc.), and biological chips embedded under the skin that hold medical conditions and history details are all wearable technologies already changing how a lot of services are delivered. Through improving miniaturization processes and manufacturing capabilities driven by more precise automation systems, these wearable technologies will cause market disruption for products that currently dominate the technology market, such as laptop computers and other larger portable computing devices. There has been a shift happening over the past couple of decades that I have been tracking along with some peers. As technology advances and devices continue to shrink in size while increasing in power, users are following suit by moving from clunky desktop systems to laptops, to ultrabooks, to tablets, smartphones, and now wearables. The newest small, accessory-like devices contain more computing power and technical capability than did the first dozen computers I owned growing up, combined.
With wearable computing as capable as it is commercially today, combined with the research being done in nanotechnology, artificial intelligence, cloud-based service offerings, and vast storage facilities, the future of wearable computers is already well in hand, with more innovations coming as we begin to fully understand how to manipulate and integrate technologies like nanotubes and nanowires that allow us to take computing capabilities down to microscopic levels. The potential is nearly limitless, with the theoretical ability to build nanomachines that are self-sufficient, self-reliant, and highly aware, and that could repair genetic defects within DNA that result in terminal illnesses, mental disabilities, or other debilitating genetic predispositions passed down through the generations. Wearable microprocessors embedded in a person's skin could be the hub that enables personal interactions with our various devices and daily systems - medical facilities, civil and government facilities, as well as large-scale advertisements - to provide a highly customized and personal experience not previously possible. Are there privacy concerns? Of course. Will there be instances of data theft? Of course. Will that deter mass adoption of such systems? No, I do not think so. This is no different than the current state of things with our smartphones, tablets, phablets, laptops, flash drives, and cloud-based facilities. As convenient as it becomes for dealing with usually stressful situations, such as going to the doctor, visiting a busy DMV, or paying for products quickly in a crowded store, people will begin to see the benefits of convenience far outweigh the potential invasions of privacy. Being able to have your personal details quickly at hand, regardless of what level of detail the user decides to include, does provide the basis for process innovation through technological innovation.
Wearables will become the primary outlet for the next generation of data sharing and digital interaction.




What do you think about the future of wearable technologies? ~Geek

Monday, September 27, 2010

Automated Systems: Will Human Interaction go the way of the Dodo?

Recently, a classmate of mine posed the following statement:
"Do you see automated systems advancing in the near future to save developers more money? I understand that most times there needs to be human to human contact, but there has to be some way to take this to the next level. Software applications are replacing ATM's for crying out loud! I see this as another threat to humans' jobs though. I guess you just have to work in the right field." ~Jermaine Edwards


Here are some of my thoughts on the subject - what do you think?
----------------------------------------------------------------------------
Eventually intuitive systems will replace most human-to-human interaction in the short term, making developers' wallets fatter and our lives simpler. Programs will continue to get more complex as the demands of users change and technology advances. The drawback is a reduction in jobs, as was experienced during the first industrial revolution, when mass production changed the way everything was done and who was no longer needed on the manufacturing floor. As you mentioned, ATMs are being replaced with iPhone apps. Eventually, I think bank branches will be replaced by full-service, online, interactive banking. We have robots that build our vehicles - one robot can do the work of 100 men in far less time; robots that perpetually clean our floors and recharge themselves when necessary; planes that can fly themselves around the world without pilot interference; etc. I think it is beyond our lifetimes that true artificial intelligence will be born, like we see in movies today. Only then do I think human interaction will become obsolete, because it will no longer be necessary - an intelligent machine in our life will perform many, if not all, of the functions we perform today, leaving little left for the human to do except system maintenance and maintaining physical relationships as desired. That may be an extreme example, but one that I think has a real chance of eventually becoming reality with the way technology is advancing. It is a trickle-down effect - a new advancement causes a common product to become obsolete, which leaves manufacturers without a product to make, distributors without a product to sell, and service providers without a means to an end. The consumer enjoys the benefit of a great product delivered quicker than ever, but the supply chain dwindles to a computer server and a delivery truck. This is a profitable arrangement for the manufacturers but leaves everyone that was in the middle stuck to reinvent themselves or close their doors.

There is also another side to this - the green effect, meaning the reduction in environmental impact and the effect that has on job availability. As technology advances, so too do the materials things are manufactured from. Most machines and consumables can be recycled today, or are biodegradable and non-toxic. Also, the increased use of Internet-based storage and collaboration has significantly reduced the amount of paper output, as well as the quantity of laser toner/ink cartridges used, which further reduces the environmental impact. This affects the job market as well: automation has a long history of leaving people and industries jobless, such as when mass production was introduced in the early 20th century and a lot of factory workers were replaced with machines. As technology advances, we invent or refine machines to help us in our endeavors, initially as tools to enhance an experience or quicken a solution. Now, machines are seen as a necessity of life - for many, to keep up with an ever-evolving world; for others, just to keep track of their daily lives. Whatever they are used for, humans are more dependent on machines and technology now than we ever have been. Unfortunately, I think a dose of laziness is driving a lot of today's innovations, leaving consumers to sit on their couches, at their desks, or in their cars doing whatever they want, whenever they want.

As far as working in the right field, IT seems to be a big fit now and in the future. We must be mindful not to lose control though: what if, God forbid, every electronic system on the planet went dead and we all had to go back to doing it manually? Since business has been married to technology for a while now, how many companies, and individuals for that matter, do you think could really survive in an old-fashioned world? My guess: a lot less than any of us think.



Tell me what you think, post a comment below.


-Geek