May 11, 2020

How TCP Works - The TCPTrace Graph (Chris Greer)

Troubleshooting a slow file transfer, backup, or any other app that moves a bunch of data in one direction?


If so, the tcptrace graph in Wireshark can save you a ton of time in tracking down the problem. This awesome graph can show pauses in data transfer, the gap between transmitted data and the receive window, and SACK blocks from duplicate acknowledgments. In this video, we dive into the tcptrace graph and show you how to interpret and analyze it to find problems fast. Check it out!




May 08, 2020

FREE WIRESHARK CLASS - LECTURE 1 - GETTING STARTED


To celebrate my 10th year on YouTube, and to thank all those who watch, like, share, and subscribe, I wanted to give you a gift.

Two years ago I created a Wireshark class for Udemy, which was very popular.

So I thought I would release the entire course for all to enjoy.

Please do not bother with the exercises, but find your own way to test the concepts I present.

Enjoy!!!



Zip - It!!!

 

That’s what my mother used to say to me when I was bad. But that’s not what we are talking about here.

However, we will talk about "bad" zipping software. Yes, zipping software doesn't always perform as expected. What a revelation.

Even though zipping files and data is a routine, normal occurrence, those conducting security or forensic exams who "save" their results or reports for delivery to management or attorneys should consider the capabilities of these zipping programs in relation to what they are zipping. Some forensic examiners zip image files, which works fine. Others zip extracted evidence after processing with a forensic suite. And still others massage the initial extracts and get the data down to a point where they will provide it to management or legal persons for review.

The problem you should consider is that during your process, you may inadvertently create some unusual data files. The forensic suite may extract alternate data streams, or extract files into a folder which has a long filename. I have seen forensic suites extract data to long filenames as a matter of course, especially when you ask them to recreate the original path/folder structure while you yourself are outputting the data to a sub-directory which already starts multiple levels down the tree. You then don't know how long the ultimate extracted path will be, and when you go to zip up the data for delivery, you may miss one or two important files, or totally miss an alternate data stream which was hiding behind an important file.

Last, but not least: does the zipping program maintain the dates of the source files and, ultimately, of the unzipped files? If you have never tested your zipping program to see how it behaves in unusual situations, you probably should. Test not only your version, but the version the opposition is using. That being said:

First, let me state: the tests I ran are not at all scientific; I consider them practical. Second: I am not using names here because I do not want to point fingers. I'm just pointing out what I found in my unscientific tests. If my minimal tests show a failure, why proceed further? Third: the test suite I used was purely arbitrary, seeded with items which, from other tests, I knew might cause problems for zipping software, but which would not be unheard of in a forensic or backup environment. Fourth: I DID NOT test any encryption capabilities of the software; I use separate PGP encryption of the stand-alone zipped file when necessary. Finally: run some tests yourself. Don't take my word for it.

I began by selecting three of the most popular zipping software packages. My versions may not be the most current, because I'm CHEAP and don't spend money needlessly. Most had both GUI and command line capability. However, for consistency, and because most people prefer the GUI interface, I only used the GUI in my tests. A very important thing to remember with the GUI versions is that unless all the correct boxes are checked, you may not get the results you expect, and may not obtain results similar to mine. But if a program hid an option so well that I couldn't find it, I may not wish to use that program in the future.

Some packages buried choices of operations which I would consider required, so I had to hunt for the options I wished to implement. Each package presented different options for the same operation, and some had no option at all for a needed capability. I MAY have missed an important option. If I did, it only means the option was buried so deep that a normal person might also fail to find it. I tried as best I could to locate and check all the boxes covering the items I tested for (i.e., long filenames, Alternate Data Streams, date retention).

Then I created a test folder. First, it contained files with Unicode characters in their filenames (e.g., CYRILLIC names).

Second, I created some files with long filenames (path/filename > 255 characters). Third, I inserted some Alternate Data Streams (ADS) into a number of the files, both those with normal-length names and those with long filenames.
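If you want to build a similar test set yourself, here is a minimal sketch in Python. It assumes Windows with an NTFS volume (ADSs and the \\?\ extended-length path prefix are Windows/NTFS features), and the file and stream names are made up for illustration.

import os

base = os.path.abspath("zip_test_data")
os.makedirs(base, exist_ok=True)

# 1. A file with Unicode (Cyrillic) characters in its name.
with open(os.path.join(base, "файл_кириллица.txt"), "w", encoding="utf-8") as f:
    f.write("unicode filename test\n")

# 2. A path/filename longer than 255 characters.
#    The \\?\ prefix tells Windows to allow extended-length paths.
long_dir = "\\\\?\\" + os.path.join(base, "a" * 200, "b" * 200)
os.makedirs(long_dir, exist_ok=True)
with open(os.path.join(long_dir, "long_name_" + "x" * 100 + ".txt"), "w") as f:
    f.write("long filename test\n")

# 3. A file with an Alternate Data Stream attached.
#    Opening "name:stream" on NTFS creates the ADS.
host = os.path.join(base, "normal_file.txt")
with open(host, "w") as f:
    f.write("visible content\n")
with open(host + ":hidden_stream.txt", "w") as f:
    f.write("this text hides in an alternate data stream\n")

Afterward, dir /r in cmd or Get-Item -Stream * in PowerShell will confirm the ADS exists; that is also a quick way to check whether it survived a zip/unzip round trip.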

I created those three types of files because, whether in a backup scenario or in an investigation whose evidentiary output will probably be sent to an opposing party (attorney), one or all of these types of data might be necessary to produce. Forensic suites and other forensic software may routinely export files with any or all of these characteristics.

Also, I have seen discussions where people ask which methods others use to store data for posterity. (That's a long time, not your rear end.)

The common method of storing or delivering data is a zip format: not only for space savings, but also to bundle everything into a single file for distribution to a requesting party.

So, let's test the zip(ping) capabilities of these programs.

Again, I'm not going to name names or identify which program failed in which area, so here are the general results. If you think some of these items may apply to your processes, you might test the software yourself. What a novel idea.

First:  All three zipped and unzipped all filenames correctly (Unicode, etc.).
Second: Two of three processed long filenames (LFN) correctly (see ADS below).
Third:  Only one processed ADSs in both normal-length and LFN filenames.
Fourth: None of them reset the last access date of the original files after the zip process.
Fifth:  Two of three properly reset the last access date of the restored file to the original access date.
Sixth:  Two of three properly reset all the MAC dates on the restored file. The third only reset the original 'M'odified date.
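If you want to automate this kind of check, here is a minimal sketch (Python, hypothetical folder names) that compares a source tree against the tree restored from an archive: same file names, same content hashes, same MAC dates.

import hashlib
import os

def snapshot(root):
    # Map each relative path to (sha256, modified, accessed, created).
    result = {}
    for dirpath, _, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            st = os.stat(full)
            with open(full, "rb") as f:
                digest = hashlib.sha256(f.read()).hexdigest()
            # On Windows, st_ctime is the creation time.
            result[os.path.relpath(full, root)] = (
                digest, st.st_mtime, st.st_atime, st.st_ctime)
    return result

src = snapshot(r"C:\zip_test_data")       # original files
dst = snapshot(r"C:\restored_test_data")  # files restored from the archive

for rel in sorted(set(src) | set(dst)):
    if rel not in dst:
        print("MISSING from restore:", rel)
    elif rel not in src:
        print("EXTRA in restore:", rel)
    elif src[rel] != dst[rel]:
        print("MISMATCH (hash or MAC dates):", rel)

Two caveats: os.walk does not see alternate data streams, so the ADS check still has to be done separately (dir /r is the quick way), and on systems without long-path support enabled you may need the \\?\ prefix on the root paths as well.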

Here is a quick and dirty spreadsheet showing which program did what. If you want to know the real names of the programs, contact me at: dm at dmares.com

Program   Unicode   LFN    ADS    Reset Src Access   Reset Dest Access   Reset MAC Dest
#1        PASS      PASS   PASS   FAIL               PASS                MAC
#2        PASS      FAIL   FAIL   FAIL               PASS                MAC
#3        PASS      PASS   FAIL   FAIL               FAIL                M
#4        PASS      PASS   PASS   PASS               PASS                MAC

Program #1, even though it captured the ADS files in both the normal and long filenames, had options for that capture which proved very confusing; I had to try to create the zip file over 5 times before the ADSs were properly captured. Program #2's GUI interface was nothing less than horrible to work with. So much so that I uninstalled it as soon as the tests were completed. A special note on Program #4, which is WINRAR and runs on both Linux and Windows. It is quite inexpensive (actually I think it's shareware, but I paid for a license). When I first tested it, the program did not have the ability to reset the source last access date. However, with one simple request, and what I think was a reasonable evidentiary explanation, the programmers agreed to include the reset of the source last access date in the next version. As of December 11, 2019, version 5.80 has all the capabilities I tested for, and it passed all my tests.

I personally prefer the command line, since I have more control. Just take a look at all the available commands and options in the command line version of WinRAR (called rar.exe). "Technically" there is a limit to the length of the path/filename in WinRAR, but it is normally not a concern. If you have an exceedingly long path/filename (>2047 characters), I suggest you get to reading War and Peace (just a joke). The 2047-character limit should be enough for most instances. In the next section I provide a command line that seems to work very well at creating a self-extracting exe which passes all my tests. You may want to check it out.

"c:\path_to_winrar\rar"  a -sfx   -r -ts+ -tsp  -os _DEMO.EXE  -zc:\"program files"\winrar\comment.txt   folders/files-to-ad  -ppassword

The content of the comment.txt file, which contains the routine required options, is below. The comment contains SFX script commands which cause the extraction of the .exe to be silent (no user input requested) and to overwrite any existing files during extraction. The other items, which begin with a semicolon, are commented out; they are included for other purposes not needed at this time.


Silent=2                                                
Overwrite=1                                             
;Setup=setup.exe                                         
;SETUP=setup16.exe          not needed    
;Presetup=hello.exe                                      
;Path=C:\temp\default_unzip_path                         
;PATH=.\.      
;SavePath                                                


The subsequent extraction/unzip command line (which is easily included in a batch file) is:
C:\_DEMO.EXE -s2 -tsp -tp+ -os -ppassword

Even though the above process seems to work and passes all my forensic, evidentiary preservation requirements, this mention is in NO WAY an endorsement of WINRAR. Don't take my word for it; test for yourself any zipping program you use and be comfortable with its operation. I have tested and use WINRAR when preparing all my test data, and it has worked admirably.
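To make "test for yourself" concrete, here is a minimal sketch that strings the pieces together: archive a source tree with the rar command shown earlier, run the self-extractor into a scratch folder, and then diff the two trees (for example with the snapshot() comparison sketched above). The paths, archive name, and password are placeholders.

import os
import subprocess

RAR = r"C:\Program Files\WinRAR\rar.exe"   # adjust to your install
SRC = r"C:\zip_test_data"                  # tree to archive
SFX = r"C:\_DEMO.EXE"                      # self-extractor to create
OUT = r"C:\restore_area"                   # where to unzip for comparison

# Create the self-extracting archive (same switches as the command above,
# minus the -z comment file for brevity).
subprocess.run([RAR, "a", "-sfx", "-r", "-ts+", "-tsp", "-os",
                SFX, SRC, "-ppassword"], check=True)

# Run the self-extractor silently; with no path set in the SFX script it
# should extract into its working directory (an assumption worth verifying).
os.makedirs(OUT, exist_ok=True)
subprocess.run([SFX, "-s2", "-tsp", "-tp+", "-os", "-ppassword"],
               cwd=OUT, check=True)

# Now diff SRC against OUT, e.g. with the snapshot() comparison above.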

Consider any or all of the above shortcomings when you are archiving your files or preparing them for discovery.

In short, none of the zip programs in my original test set passed all the tests in this minimal test process (WinRAR only did so after the 5.80 update). The tests included: Unicode filename retention, LFN, ADS, and resetting ALL appropriate MAC dates (of source and restored files).

AND: When you actually think about it, isn't a "zipping" process a sort of copy method for retention, discovery, and safe data saving? What evidence might you be missing in the zip process? Also, is next year's version of the zipping program going to be able to unzip last year's files? And is product 'A' capable of processing the zip file of product 'B'?

A final thought, not included in the list above: I tested a recent "free" version of PGP (v8). It compresses, and lo and behold, it also had failures. However, since I don't use PGP to compress, only to encrypt, I didn't include it in the statistics.

So, which zipping program are you using to store and restore your legacy data or evidentiary file data?

Associated articles and programs of interest:
hash - a program to calculate hash values.
HASH_IT_OUT - an article discussing forensic hashing of evidence.
COPY_THAT - an article discussing forensic copying of evidence.
ZIP_IT_TAKE2 - an article explaining the testing of zipping software.

Test data - containing about 30 files, in a self-extracting executable.

May 05, 2020

7 Emerging Trends In Cloud Technology For 2020

With each passing year, cloud computing is slowly becoming synonymous with computing as such.


Calculations that involve massive amounts of data are simply not solvable on a single computer, no matter how powerful. Instead, the task has to be distributed over a network of machines that comprise the modern computing cloud.

Understanding the current state of cloud computing is vital for business owners. By leveraging cloud technology, businesses can keep running even through economic disruption, such as the one we are experiencing due to the Covid-19 pandemic.

To bring you up to speed on where cloud computing is heading in 2020, we decided to cover the latest trends in the cloud industry. You can find our coverage in the paragraphs below.


1. The Rise of Containers

A container is an isolated, virtualized software environment for running applications. Containers are the backbone of many cloud computing systems: they offer an effective way to distribute computing resources, and cloud systems built on them scale easily, simply by adding more containers.

According to 451 Research, the container market is set to reach a worth of $2.7 billion this year. A survey by Cloud Foundry further confirms this trend: 53% of surveyed companies are in the process of integrating containers into their tech stack.

Kubernetes is by far the most popular container solution on the market. What started as a Google project in 2014 has grown into the leading platform for container orchestration, one that the major cloud providers now offer as a managed service.


2. Serverless Computing

The basic building block of a computer network is the server. And since running a business without a computer network is impossible, companies have had no choice but to rent or purchase servers to host their data and applications.

The problem with servers is that you can rarely tell in advance how much power and space you will need, and for how long. You could end up spending a hefty sum upfront, only to find out you're using just 50% of the computing power you paid for.

Enter serverless computing. This newer cloud service model eschews paying for servers on a per-unit basis. Instead, serverless providers offer a pay-as-you-go model in which you are billed only for the compute your code actually consumes. Serverless computing is another step in the increasing decentralization of IT infrastructure.
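As a concrete illustration, this is roughly what the unit of deployment looks like in a function-as-a-service offering such as AWS Lambda: you supply a handler function, and the provider handles provisioning, scaling, and per-invocation billing. A minimal sketch in Python; the event fields are made up for the example.

import json

def handler(event, context):
    # Entry point invoked by the platform; there is no server to manage.
    name = event.get("name", "world")   # hypothetical request field
    return {
        "statusCode": 200,
        "body": json.dumps({"message": "Hello, " + name + "!"})
    }

The same function can sit idle at zero cost or scale to thousands of concurrent invocations, which is exactly the pay-as-you-go property described above.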


3. Hybrid Clouds

As cloud computing proliferates, we will see more and more enterprises developing cloud infrastructure for internal use. Such private clouds can be built on top of existing networks and resources. While setting up a private cloud can be a complex endeavor, the benefits in terms of scaling, flexibility, and data safety make it a worthwhile choice.

Some companies are taking things a step further and adopting a hybrid cloud approach. In addition to using internal cloud networks, these companies are utilizing public cloud infrastructure as well. Such hybrid solutions allow companies to fully harness the power of the cloud.

Hybrid clouds are a sign that companies are looking for ways to prevent vendor lock-in. Relying on a single cloud provider is akin to putting all your eggs in a single basket, and hybrid clouds represent a way to avoid this.


4. Edge Computing

The main advantage of cloud computing over centralized computer networks is its distributed nature. Instead of relying on a single hub for computation, cloud networks are divided into nodes with a certain geographic distribution. The edges of such networks are used as an entry point for users wanting to interact with the cloud. This is edge computing.

Cloud providers are increasingly investing in edge computing in an effort to give users data and computing power over low-latency connections. Devices located on the edge have their own computing, storage, and network modules. They serve as gathering spots for processing information from other parts of the network; that information is then sent to the nearest data center according to pre-defined protocols.


5. Cloud for Mobile

The transition to a mobile-first environment has affected cloud computing as well. Mobile cloud computing is a model for developing applications for portable devices such as smartphones and tablets.

According to this model, the user side of the application is a lightweight client interface used for presenting data and submitting queries, while the actual computation is performed entirely in the cloud. This allows feature-rich mobile apps to run on any kind of device, thanks to the cloud.
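In practice, the lightweight client can be as thin as a single HTTP call: the device sends inputs to a cloud endpoint and renders whatever comes back, while all heavy computation happens server-side. A minimal Python sketch; the URL and fields are made-up placeholders.

import requests

# Hypothetical cloud endpoint that does the heavy lifting server-side.
API = "https://api.example.com/v1/recommendations"

response = requests.post(API, json={"user_id": 42, "context": "home_screen"})
response.raise_for_status()
for item in response.json().get("items", []):
    print(item)   # the client only presents results; it computes nothing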

While mobile cloud computing is on the rise, there are still some pending issues. One of them is security, as mobile devices are notoriously prone to hacking attacks.


6. Cloud AI

The cloud is a natural environment for artificial intelligence. AIs hosted on the cloud gain direct access to massive amounts of data, which enables them to optimize their core competencies via machine learning.

AI is also being used to solve cloud-related problems. From determining trends in power usage in server clusters, to finding patterns of network failure, AIs are essential for the continued development of cloud infrastructure.

Conversely, the distributed nature of cloud computing gives AI the ability to manage its resources more effectively, allowing for faster computation, and therefore faster learning.


7. Cyber-security Within the Cloud

Ever since the Capital One data breach last year, there has been growing interest in, and concern about, cloud security. The Capital One case showed that when cloud credential management systems go haywire, disastrous consequences soon follow. This has prompted cyber-security vendors to focus their efforts on cloud security.

A cloud is a system with many moving parts, which makes implementing security measures more difficult than in the case of conventional computer networks. The cloud has numerous points of entry by design, which unfortunately makes it more vulnerable to attacks from hackers.

Cloud providers such as Amazon are increasingly using AI to automate their security efforts. They are also simplifying cloud access interfaces to limit potential avenues of attack.

The Future of The Cloud

Over the coming year, we can expect cloud computing to evolve further, eventually becoming so ubiquitous that it is synonymous with computing as such. Once cloud vendors solve the remaining infrastructure problems, we can expect the cloud to become a commodity like electricity or internet access. At that stage, cloud services will proliferate, bringing the cloud closer to businesses and consumers alike.


Author - Angelina Harper - Angelina is a tech writer and contributor to reputable digital marketing websites.

She is interested in cloud technology, web design and social media marketing.

If you would like to read more of her articles, follow her on her LinkedIn profile.
