Tuesday, June 23, 2020

KAPE at Scale

After reading @Carlos_Cajigas' post and getting a personal demonstration (thanks for that), it got me thinking: what if we didn't have to download KAPE to the system to run it? Something along the lines of Sysinternals Live. Furthermore, what if, after the collection was done, the remote server automatically ran the modules you want? And while we are at it, why not send an email when everything is done processing, so we don't have to periodically check on it? This would eliminate downloading tools to the endpoint, processing the artifacts on the endpoint, and waiting around for everything to finish. With that goal in mind, that is what I created.

The Setup:

We are going to need two things to set this up:
1) KAPE
 KAPE can be downloaded from here
2) A web-server
 This can be either a local or cloud based server.

The first thing we want to do is download and set up KAPE on the web-server. I am not going to go into detail on setting KAPE up; there is plenty of documentation out there. Once KAPE is ready, we need to make an SFTP configuration for KAPE; this is how we will send the collection back to the server. At a minimum, the SFTP account is going to need upload and delete access. More information on setting up SFTP for KAPE can be found here.

Example configuration file.

We can now create a scheduled task to run KAPE in SFTP mode whenever the server starts.

Next, we can set up the web-server. I created a script to do all the heavy lifting so you don't have to. It can be found here. The script sets up a web-server with WebDAV enabled, creates the accounts needed to access the site, and creates a WMI subscription for the automation. *Please note, this is not production ready. You will need to secure things better than the script does.

Web-server setup:

Here's a breakdown of what the script is doing:

The first thing we need to know is the install location of KAPE and the drive we want to monitor for incoming collections.



Next up, the email parameters. *Note: the email password will be encrypted under the SYSTEM account.


The script will then install the features needed for the web-server. WebDAV is enabled so we can mount KAPE remotely as a file share; this way, there is no need to download KAPE to the endpoint. After that, we need to set up the user and group that will access the site. This group has read-only access so nothing can be written back to the KAPE folder when it is mounted.


After that, the script will finish configuring WebDAV, change the WMI Provider Host quota configuration, and set up the WMI subscription. There are a couple of reasons I went with a WMI subscription: there isn't a script lying around to accidentally get deleted, and it runs KAPE under the SYSTEM account. Once done, the system will need to reboot.


Ready for action:

With setup complete, we can now test everything out. On the machine you want to collect from, check whether you can reach the website. You should see the directory for your KAPE instance.


Now that we know we can reach the site, let's mount it as a network share.



After that, we can run KAPE. For the automation piece, we will need to use the KAPE_Automation module. The module takes two variables: module and mvar. The module variable is the list of modules you want KAPE to run on the collection; just like KAPE itself, this is a comma-separated list. The mvar variable takes key:value pairs, but instead of using ^ as a separator, it uses ◙ (Alt+10). See the example in the module.
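Since the ◙ separator is awkward to type, a small helper can assemble the --mvars value programmatically. This is an illustrative sketch (build_mvars is a hypothetical helper, not part of KAPE itself):

```python
# Sketch: assemble the --mvars argument for the KAPE_Automation module.
# build_mvars is a hypothetical helper, not part of KAPE.
SEP = "\u25d9"  # the ◙ (Alt+10) separator used inside the mvar value

def build_mvars(modules, mvar_pairs):
    """modules: list of module names; mvar_pairs: dict of key -> value."""
    result = "module:" + ",".join(modules)
    if mvar_pairs:
        inner = SEP.join(f"{k}:{v}" for k, v in mvar_pairs.items())
        result += "^mvar:" + inner  # ^ separates KAPE's own module variables
    return result

print(build_mvars(
    ["SEPM_Logs", "Mini_Timeline", "Mini_Timeline_Slice_by_Daterange"],
    {"dateRange": "06/19/2020-06/12/2020", "computerName": "Collection"},
))
```

The printed string matches the --mvars value used in the collection command later in this post.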



Let's try it out. The following command will collect the registry hives, $MFT, and Symantec AV logs and send them to the server via SFTP. The server will then mount the VHDX, parse the Symantec logs, and create a timeline with the date range 06/19/2020-06/12/2020. An email will be sent when everything is done.

\\192.168.0.20\kape\kape.exe --tsource c --target RegistryHives,FileSystem,Symantec_AV_Logs --tflush --tdest C:\temp\tout --mdest C:\temp\tout\mout --mflush --module KAPE_Automation --mvars module:SEPM_Logs,Mini_Timeline,Mini_Timeline_Slice_by_Daterange^mvar:dateRange:06/19/2020-06/12/2020◙computerName:Collection --vhdx %m --scp 22 --scu KapeSFTP --scpw NrsxPmU8XWe72WBs --scs 192.168.0.20 --debug --trace


Empty case folder on collection server.

No email.

Collection complete and uploaded to server.

Parsed Symantec AV logs.

Parsed timeline.

Email sent upon completion.

Conclusion:

I hope this helps you set up remote collection and parsing with KAPE. If you have any ideas or suggestions to help improve the automation of KAPE, please leave a comment. You can also open an issue or submit a pull request on the GitHub page.


Tuesday, January 14, 2020

One of these VBNs is not like the other

In a previous post, Symantec Endpoint Protection VBN files, I described the file structure of VBN files that contain quarantined files and the process to extract them. It turns out there is another VBN file, with a different structure, that can contain quarantined files. These files reside in the Quarantine folder itself, not in a subdirectory. The easiest way to tell that they hold quarantined files is by their size compared to the other VBNs in the folder. In the screenshot below, we can see that something is not quite right with 1C980000.VBN.

These VBN files start off like any other VBN. We can grab the first four bytes to find the offset to the Quarantine File Meta header (QFM). Instead of the QFM header, however, we find a different structure. This structure is also XORed with 0x5A. (Note: this is one example. I have other files that do not follow this format; further investigation is needed.)

Examining the structure, we can see that there is an offset that leads to the beginning of the quarantined file and another offset marking the end of the file. With this information, we can extract the quarantined file for further examination. All we need to do is take the QFM offset and add our new offset to it; this is the beginning of the file. To find the size of the file, we subtract the QFM offset from the file offset and subtract that from the EOF offset. Now that we know where the file starts and ends, we can extract the contents and XOR them with 0x5A.
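Put together, the carving logic looks roughly like this. This is a sketch, not DeXRAY's implementation: because the header layout varies between samples, the two offsets read from the structure (both relative to the QFM offset) are passed in rather than parsed here.

```python
# Sketch: carve the quarantined payload out of one of these VBN variants.
import struct

def xor_5a(data: bytes) -> bytes:
    # Symantec XORs both the header structure and the payload with 0x5A.
    return bytes(b ^ 0x5A for b in data)

def extract_quarantined(vbn: bytes, file_off: int, eof_off: int) -> bytes:
    # The first four bytes of any VBN give the offset to the QFM header.
    qfm_offset = struct.unpack_from("<I", vbn, 0)[0]
    start = qfm_offset + file_off   # absolute start of the quarantined file
    size = eof_off - file_off       # length of the quarantined file
    return xor_5a(vbn[start:start + size])
```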

I have also updated DeXRAY to handle these files.


Friday, June 14, 2019

Introducing SEPparser

SEPparser was created because I could not find anything to parse Symantec Endpoint Protection logs into a human-readable form. I was fairly successful with MS Log Parser, but it couldn't parse all the logs correctly. It did not make sense to me to have to go into the SEPM console to query logs when they were right on the endpoint. These logs contain a wealth of untapped information that can be used during an investigation. I hope you find it useful.

SEPparser is a command line tool for parsing Symantec Endpoint Protection logs. You can either feed it a single file or an entire directory. This even works remotely. SEPparser will figure out what log it is and parse it correctly.

Symantec logs are in the following locations:
C:\ProgramData\Symantec\Symantec Endpoint Protection\CurrentVersion\Data\Logs
C:\Users\%user%\AppData\Local\Symantec\Symantec Endpoint Protection\Logs


SEPparser.py -h
usage: SEPparser.py [-h] [-f FILE] [-d DIR] [-o OUTPUT] [-a]

optional arguments:
  -h, --help            show this help message and exit
  -f FILE, --file FILE  file to be parsed
  -d DIR, --dir DIR     directory to be parsed
  -o OUTPUT, --output OUTPUT
                        directory to output files to. Default is current
                        directory.
  -a, --append          append to output files.

By default, all csv files will be placed in the directory SEPparser is run from. You can also designate a folder to store them in with the -o option.

After running, the directory should look like this:
The CSV files correspond to the logs you would find in the SEP GUI on the endpoint. SEPparser also parses additional information out of the logs that you would not see in the GUI. Symantec_Timeline.csv is the combined results of the daily AV logs and AVMan.log. As an example, let's look at a risk entry in the SEP GUI. This is all the information you will get:
Let's see what additional information we can get with SEPparser. SEPparser will give us information like company name, file size, file hash, product version, and product name.



We can also find the signing certificate information.



In addition to the log files, a packet.txt file is created. This file is a hex dump of all packets from the packet log and can be viewed with Wireshark.
In Wireshark, go to File > Import from Hex Dump...
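The dump format Wireshark's import dialog understands is just an offset followed by hex bytes. Generating one from raw packet bytes might look like this (a sketch; hex_dump is a hypothetical helper, and SEPparser's exact output may differ in detail):

```python
# Sketch: write raw packet bytes as an offset-prefixed hex dump, the
# plain-text format Wireshark's "Import from Hex Dump..." can read.
def hex_dump(data: bytes, width: int = 16) -> str:
    lines = []
    for off in range(0, len(data), width):
        chunk = data[off:off + width]
        lines.append(f"{off:06x}  " + " ".join(f"{b:02x}" for b in chunk))
    lines.append(f"{len(data):06x}")  # trailing offset marks end of packet
    return "\n".join(lines)

print(hex_dump(b"\x45\x00\x00\x3c"))
```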

Select the packet.txt file and click Import.

You can now view the packets and save them as a pcap if you choose.

Download
https://github.com/Beercow/SEPparser
https://github.com/Beercow/SEPparser/releases

Tuesday, April 2, 2019

Copying locked OST files

When trying to copy OST files that were in use, I was running into the following error:

esentutl.exe /y /vss <file_to_copy> /d <file_to_save_as>

Operation terminated with error -1 (JET_wrnNyi, Function Not Yet Implemented) after 4.390 seconds.

The reason being, the Windows VSS engine ignores Outlook's .OST files by default.


To work around this, the OutlookOST value must be deleted from HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\BackupRestore\FilesNotToSnapshot.

reg delete HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\BackupRestore\FilesNotToSnapshot /v OutlookOST /f

Once this is done, the file can be copied.

esentutl.exe /y /vss <file_to_copy> /d <file_to_save_as>

And then the value can be restored when the file is done being copied.


reg add HKEY_LOCAL_MACHINE\SYSTEM\CurrentControlSet\Control\BackupRestore\FilesNotToSnapshot /v OutlookOST /t REG_MULTI_SZ /d $UserProfile$\AppData\Local\Microsoft\Outlook\*.ost /f

Monday, April 1, 2019

All things Symantec

This post contains information on my research into Symantec logs and quarantine files. Content will be updated regularly.

Symantec Endpoint Protection Logs

Symantec Management Client (smc) does not show the entire contents of the logs. smc.exe has an -exportlog command-line switch where you can select a log type to export. Log type numbers are as follows:
  • 0 = System Log
  • 1 = Security Log
  • 2 = Traffic Log
  • 3 = Packet Log
  • 4 = Control Log 
These numbers also correlate to an entry in the header of the logs found in C:\ProgramData\Symantec\Symantec Endpoint Protection\CurrentVersion\Data\Logs.
  • 0 = syslog.log
  • 1 = seclog.log
  • 2 = tralog.log
  • 3 = rawlog.log
  • 4 = processlog.log
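The two numbering schemes line up one-to-one. In code, the correlation might be captured as follows (a sketch; the header entry format itself is not shown here):

```python
# Sketch: map smc.exe -exportlog log_type numbers to the corresponding
# on-disk log files under ...\Symantec Endpoint Protection\...\Data\Logs.
LOG_TYPES = {
    0: ("System Log", "syslog.log"),
    1: ("Security Log", "seclog.log"),
    2: ("Traffic Log", "tralog.log"),
    3: ("Packet Log", "rawlog.log"),
    4: ("Control Log", "processlog.log"),
}

def log_file_for(log_type: int) -> str:
    return LOG_TYPES[log_type][1]
```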

Log File Structure

Symantec Endpoint Protection VBN Files

The folder structure makes a difference in what is contained in the VBN file. SEP quarantine files are located in C:\ProgramData\Symantec\Symantec Endpoint Protection\CurrentVersion\Data\Quarantine. In the Quarantine folder, there is a VBN file and a folder with the same name as the VBN file.


Wednesday, December 5, 2018

Comparing Packet Captures to Procmon Traces Revisited

In my previous post, Comparing Packet Captures to Procmon Traces, I demonstrated how to match Procmon to pcap data. When I looked at this again, I noticed something peculiar: when Procmon shows a length of 3760, everything gets thrown off.



Looking at the output from Procmon and TCPdump, everything matches up until we hit a length of 3760. So what is happening here? It turns out that if you want to match the packets up, one of them needs to be split.

So there is an exception to the rule: if the length equals 3760, we have to add the length of the next entry to it. The packets in TCPdump should add up to this combined number. Looking at the example, the third packet is split between the two Procmon entries.
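The exception is easy to apply mechanically. Here is a sketch (normalize_procmon_lengths is a hypothetical helper) that merges each 3760-byte Procmon entry with the entry that follows it, producing lengths that should line up against the pcap:

```python
# Sketch: fold each 3760-byte Procmon entry into the next entry so the
# resulting lengths can be matched against TCPdump packet lengths.
def normalize_procmon_lengths(lengths):
    out, i = [], 0
    while i < len(lengths):
        if lengths[i] == 3760 and i + 1 < len(lengths):
            out.append(lengths[i] + lengths[i + 1])  # merge with next entry
            i += 2
        else:
            out.append(lengths[i])
            i += 1
    return out

print(normalize_procmon_lengths([100, 3760, 240, 50]))
```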

Friday, August 3, 2018

Windows 10 Notification WAL database

David Cowen recently wrote an article about revisiting the Windows 10 Notification database. From my observations, the database is in Write-Ahead Logging mode, and the wpndatabase.db-wal file can contain deleted entries. I came up with a way to view the WAL file.

I forked a python script (Walitean) because the endianness of the integers was wrong. With my forked version, you can convert the wal file into a sql database to view by doing the following:



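As an aside on the endianness fix: the same four bytes decode to very different integers depending on the byte order assumed, which is why the original script produced wrong values when carving records out of the WAL file. A generic illustration:

```python
# Sketch: byte order matters when decoding integers carved from a file.
import struct

raw = b"\x00\x00\x00\x2a"
print(struct.unpack(">I", raw)[0])  # 42 when read big-endian
print(struct.unpack("<I", raw)[0])  # 704643072 when read little-endian
```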
Once the wal file is converted, you can run the following sql query to parse the database:

SELECT unknown0 AS Id, unknown1 AS HandlerId, unknown2 AS ActiveId, unknown3 AS Type, unknown4 AS Payload, unknown5 AS Tag, unknown6 AS 'Group',
 datetime((unknown7/10000000)-11644473600, 'unixepoch') AS ExpiryTime, datetime((unknown8/10000000)-11644473600, 'unixepoch') AS ArrivalTime,
 unknown9 AS DataVersion
FROM IIBTBUUIIU
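The datetime() arithmetic in the query converts Windows FILETIME values (100-nanosecond ticks since 1601-01-01 UTC) to Unix epoch time. The same conversion in Python looks like this (a sketch mirroring the SQL, not part of Walitean):

```python
# Sketch: convert a Windows FILETIME (100-ns ticks since 1601-01-01 UTC)
# to a Unix-epoch-based datetime, as the SQL query above does.
from datetime import datetime, timezone

EPOCH_DIFF = 11644473600  # seconds between 1601-01-01 and 1970-01-01

def filetime_to_datetime(ft: int) -> datetime:
    return datetime.fromtimestamp(ft // 10_000_000 - EPOCH_DIFF, tz=timezone.utc)

print(filetime_to_datetime(116444736000000000))  # the Unix epoch itself
```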



My forked version of Walitean can be found here:

https://github.com/Beercow/walitean