January 13, 2020

A Random Walk(*) with Artificial Intelligence (by Paul W. Smith)

 


My wife and I recently joined some of the last homeowners to contribute to the $11 billion robot vacuum industry. Since I am usually the one who does the vacuuming, I liked the idea of completing this chore with the simple push of a button. It was an exciting day when I took our new Roomba out of the box, placed it on its little charging dock (which we located in the laundry room for our convenience), and then stood back and pressed “Clean” on my iPhone app.

Our Roomba (which we named “MaxBot”) banged up against the dryer a few times, then headed off down the hallway toward the living room. Our 13-year-old Shih Tzu soon began barking frantically, having failed to grasp the utility of this intruder. At his age, the only thing he does frantically anymore is eat; it was clear that a black hockey puck bigger than he is should not be scooting around the house unattended – at least not on his watch. Lesson #1: AI robots are not for everyone.

According to the promotional literature, robot vacuums use Artificial Intelligence to map out and store the boundaries of your home, calculating the most efficient way to cover every square foot. They are also supposed to anticipate their own demise, returning to the home base and recharging when needed.

On the morning after the first trial run, I awoke expecting to find a bunch of neatly vacuumed rows, like the freshly mowed patterns on a major league baseball field. I assumed the robot would be back in the laundry room, with full batteries and eagerly awaiting its next cleaning assignment. Instead, MaxBot was cowering guiltily under the dining room table and the living room looked as if it had been run over by a drunk monkey. While I was peacefully dreaming of things unrelated to household chores, MaxBot had texted a desperate message to my phone, pleading for a charge. The laundry room base station was the most convenient for us, but not for MaxBot which was unable to find its way back there. Lesson #2: AI will require some accommodations.

The naturally intelligent folks at Stanford have studied the impact of emerging AI developments on robotics, predicting that by the end of the next decade, domestic robots like ours will be much more common. They noted that current robot vacuums don’t do stairs, while the majority of homes have at least one flight (MaxBot tips about 30 degrees over our top stair before altering course in search of level ground). Those same folks who tape over the little camera on their laptop computer have also expressed concern that robotic vacuums contain valuable data about the size and floor plan of our rooms, coupled with the geo-location of the home. Stanford NI believes these and other problems will eventually be overcome.

One of the most puzzling aspects of AI in many of its applications is that it can be hard for even the developers to understand what it’s up to. Algorithms based on self-learning, neural networks, or deep learning can make AI smarter without revealing exactly how it got that way. One example from the medical field was recently reported in New Scientist.

If your cardiologist diagnoses you with a serious incurable heart problem, you would understandably want to know just how much time you have left. Doctors expect this question and will make a reasonable and compassionate effort to answer it. Meanwhile, scientists in a Pennsylvania healthcare group have been evaluating an AI solution.

In order to avoid the inevitable ethical questions, their AI test system was given electrocardiogram data for patients whose date of demise was already known. The researchers not only found that the AI was much more accurate than the cardiologists in its predictions, it also forecast the risk of death in patients previously classified as having a normal ECG. Although this new AI technology can tell you when you will die, no one is quite sure how it knows.

All of which leads to the most important lesson of all, Lesson #3: Don’t expect too much from technology, especially one that cleans like a drunk monkey.

(*) The movements of an object or changes in a variable that follow no discernible pattern or trend.

Author Profile - Paul W. Smith - leader, educator, technologist, writer - has a lifelong interest in the countless ways that technology changes the course of our journey through life. In addition to being a regular contributor to NetworkDataPedia, he maintains the website Technology for the Journey and occasionally writes for Blogcritics. Paul has over 40 years of experience in research and advanced development for companies ranging from small startups to industry leaders. His other passion is teaching - he is a former Adjunct Professor of Mechanical Engineering at the Colorado School of Mines. Paul holds a doctorate in Applied Mechanics from the California Institute of Technology, as well as Bachelor’s and Master’s Degrees in Mechanical Engineering from the University of California, Santa Barbara.




January 07, 2020

Using Wireshark to Find the HTTP Login Decode

 


In past articles I covered how to search for HTTP login credentials.

After some feedback, I wanted to cover another approach used to find login credentials.

Let me start with a simple statement: these articles are meant for troubleshooting, baselining, or protocol analysis practice. I always tell my clients that if you don’t like having your passwords in easily decoded or cleartext format, you can either change the application or use HTTPS or other techniques to protect your cleartext data.

The approach in this example is for those web applications that use an HTML form for login/authentication.
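To show why form-based logins are so easy to read off the wire, here is a minimal sketch of decoding a captured login POST body. The field names ("username", "password") and values are hypothetical — real applications use whatever names appear in the HTML form's input fields — but the URL-encoded format is what you will see in the packet payload.

```python
from urllib.parse import parse_qs

# A hypothetical form-login POST body as it would appear in the
# payload of a captured HTTP request. Field names vary by application;
# check the HTML form's <input name="..."> attributes for the real ones.
post_body = "username=alice&password=s3cret%21&submit=Login"

# parse_qs decodes the URL-encoding (%21 -> "!") and splits the fields.
fields = {name: values[0] for name, values in parse_qs(post_body).items()}

print(fields["username"])  # alice
print(fields["password"])  # s3cret!
```

In Wireshark itself, a display filter such as http.request.method == "POST" is one way to narrow a capture down to candidate login submissions before inspecting the request body.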

Enjoy







December 19, 2019

Packet Capture vs Accurate Packet Capture (Chris Greer)

I just wanted to take a few minutes to share the results of some of the "Capture Limit" testing I have been doing in my lab. These results were shared at Sharkfest Europe 2019 in Estoril, Portugal. The purpose of the session was to discuss the considerations of building your own capture appliance. I am not trying to promote any specific product; rather my goal is to demonstrate the limits where the accuracy of a capture on a laptop becomes questionable.

During my performance testing, I found that there was a huge difference between capturing everything (no packet loss) and capturing everything correctly (packet timing is accurate). Before getting into the results of the testing, let me tell you a bit about my setup.

My line-rate 1Gbps traffic generation box sent benign IP traffic (IP protocol number 99) to a target machine. The connection was tapped twice - one feed sent to my MacBook Pro off a network tap, and the other sent through a ProfiShark device to another capture point. The ProfiShark does the capture on the device itself, while the network tap just forwards the traffic to the capture device, where packets are collected and timestamped.

The traffic stream was sent as either small packets (100 bytes), medium sized packets (512 bytes) or large packets (1518 bytes). My traffic generator could only do one packet size per test, so I ran it a few times to see the differences. I gradually reduced the throughput rate until capture point A would (a) keep up with the ingress rate, and (b) accurately timestamp the packets.

Here were the results, with 100,000 packets generated per test:

Let's examine these results.

In the test, 100,000 packets were sent to the target with varying packet sizes and throughput rates. Notice that at Capture Point A, I was only able to capture all the packets when the rate was turned down to about 250Mbps. Even then, there was a ton of false jitter in the packets. The inter-packet delta times were all over the place, with a maximum value of around 20 milliseconds. This is pretty bad considering that the deltas should have been no higher than a few microseconds.

The second thing to note is that the timestamping was off on Capture Point A until the rate was backed down to about 10Mbps. At this point, the delta times smoothed out and the capture device was able to keep up with the ingress traffic, timestamping it appropriately.
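The delta-time check above is simple arithmetic on consecutive timestamps, and you can run it on any capture. Here is a sketch using made-up timestamps (not my lab data) where one 20-millisecond gap stands out against otherwise microsecond-scale spacing:

```python
# Hypothetical packet arrival timestamps in seconds, standing in for the
# timestamp column of a real capture.
timestamps = [0.000000, 0.000008, 0.000015, 0.020015, 0.020023]

# Inter-packet delta = gap between consecutive packets, in microseconds.
deltas_us = [(b - a) * 1e6 for a, b in zip(timestamps, timestamps[1:])]

max_delta_us = max(deltas_us)
print(f"max inter-packet delta: {max_delta_us:.0f} microseconds")

# On a steady line-rate stream, a max delta in the milliseconds is a sign
# that the capture device fell behind - not that the network jittered.
```

The same idea is exposed in Wireshark as the "time delta from previous captured frame" value, which is where the false jitter in these tests showed up.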

These tests were run both in the Wireshark GUI and on the command line with dumpcap. The results were only slightly better with dumpcap.

All the while, the hardware-backed appliance was able to keep up with line-rate 1Gbps, with correct timestamping.

Conclusion

If I am going to capture a packet stream at anything higher than 10Mbps throughput, it's best to do it with my ProfiShark or another purpose-built appliance. Capturing all the packets is not enough for me - I also need them to be timestamped correctly. Hence the difference between packet capture and accurate packet capture!

Got questions? Let's get in touch!




