Kevin Neely's Security Notes

A place where I can post security-related long-form thoughts, notes, and articles

This is a log of experiences and experimentation in moving from more traditional home computing –ATX cases, components, water cooling, and continual upgrades– to something a bit more modular in terms of GPU computing power. This guide probably isn’t for most people. It’s a collection of notes I took during the process, strung together in case they might help someone else looking to pack multiple power use cases into as small a format as possible.

[Note:] A later evolution should involve a similar down-sizing of a home storage appliance.

Objectives

An external GPU requires more setup, and -let’s face it- fiddling than getting a gaming laptop or a full PC case that can handle multi-PCIe-slot GPUs. So why do it? A couple of objectives had been bouncing around in my head that led me to this:
– I need a system that can run compute-intensive and GPU-intensive tasks for long periods of time, e.g. machine learning and training large language models
– I need a light laptop for travel (i.e. I don’t want to carry around a 5+ lb./2.5 kilo gaming laptop)
– I want to be able to play recent games, but don’t need to be on the cutting edge of gaming
– I want to reduce the overall space footprint of my computing devices

In summary, I want my systems to be able to handle the more intensive tasks I plan to throw at them: a Windows laptop for gaming and travel, and a stay-at-home system that can perform long-running tasks such as AI model training, password cracking, and daily cron jobs.

Things I don’t care about:
– being able to play games while traveling
– document data diverging across multiple systems: I use a personal #NextCloud instance to keep my documents in sync

Current State

I have a number of personal computing devices in my home lab for testing things and running different tasks, but they’re all aging a bit, so it is time to upgrade:
– my Razer Blade 13 laptop is from 2016
– my main tower/gaming PC is from 2015 with an Nvidia GTX 1060
– an i5 NUC from 2020 (unused)
– an i3 NUC from 2013 (unused)
– a 6TB NAS with 4 aging 2TB drives from 2014
– Raspberry Pis and some other non-relevant computing devices

Configurations

With the objectives in mind, and realizing that my workload system would almost certainly run Linux, the two configurations for experimentation were:
– Intel NUC with an eGPU
– lightweight laptop (e.g. Dell XPS 13) with an eGPU

[Note:] The computing systems must support at least Thunderbolt3, though version 4 would be best for future-proofing.

Image: Original GTX 1060 GPU slotted in the Razer Core X Chroma enclosure

Background Research

Before starting on this endeavor, I did a lot of research to see how likely I’d be able to succeed. The best source I found was the eGPU.io site, with many reviews and descriptions of how well specific configurations worked (or didn’t). They also have nice “best laptop for eGPU” and “best eGPU enclosures” matrices.

Nvidia drivers and Ubuntu

Installing Nvidia drivers under #Ubuntu is pretty straightforward these days, with a one-click install option built-in to the operating system itself. The user can choose between versions, and my research showed that most applications required either version 525 or 530. I installed 530.
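For reference, here is a minimal command-line sketch of the same install, assuming Ubuntu’s ubuntu-drivers tooling and the 530 package; exact package names vary by release:

# list detected GPUs and the drivers Ubuntu recommends for them
ubuntu-drivers devices
# install a specific driver version (530 in my case)
sudo apt install nvidia-driver-530
# reboot, then confirm the driver loaded
nvidia-smi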

eGPU information

The two best sources I found for information on configuring and using eGPUs were:
– r/eGPU on reddit, including their “so you’re thinking about an eGPU” guide
– egpu.io

Proof-of-concept

Having read a fair amount about the flakiness of certain #eGPU setups, I approached this project with a bit of caution. My older tower had a respectable, if aging, GTX 1060 6GB in it, and I already had a recent Core i5 Intel NUC running Ubuntu and some test machine learning applications, so all I needed to fully test this was the enclosure. Researching the various enclosure options, I chose the Razer Core X Chroma because:
– the Razer Core X series appears to have some of the best out-of-the-box compatibility
– I’ve been impressed with my aging Razer laptop, so I know they build quality components
– the Chroma version adds what is basically a USB hub in the back, with 4 USB 3.x ports and an ethernet jack on top of the plain Core X
My thinking was that this system could not only provide GPU, but also act as an easy dock-hub for my primary computers. This didn’t work out quite as I planned (more in the next post).

The included thunderbolt cable is connected from the NUC to the eGPU. Theoretically, the standard peripherals (keyboard, mouse, etc.) should be connected to the eGPU hub and everything will “just work”. However, in my testing, things worked best with the peripheral hub I use plugged into the NUC and only the #Thunderbolt cable plugged into the enclosure. In the spirit of IT troubleshooters everywhere: start by making the least amount of change and iterate from there.
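If the enclosure doesn’t show up at all, the Thunderbolt authorization layer is worth ruling out first. A quick sketch using boltctl from the bolt package (substitute the UUID that boltctl list reports for the enclosure):

# list Thunderbolt devices and their authorization status
boltctl list
# permanently enroll the enclosure so it is authorized on every boot
boltctl enroll <device-uuid>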

Image: The Razer Core X Chroma enclosure with the Intel NUC on top

Experience

The NUC was on Ubuntu 20.04. The drivers installed just fine, but the system just wouldn’t see the GPU. Doing some research, it looked like people were having better results with more recent versions of Ubuntu, so I did a quick release upgrade (sudo do-release-upgrade) to 22.XX. The GPU worked! However, the advice I’d been given was to upgrade to 23.04, so I did that, and the system still worked fine.
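For anyone retracing this, two standard commands make it easy to tell whether the OS can see the card at each stage:

# does the PCIe bus see the card over Thunderbolt?
lspci | grep -i nvidia
# does the driver see it?
nvidia-smi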

Using ChatGPT to Create Service Description Documentation

TL;DR

For the impatient, here is the final output from using ChatGPT to create service description documentation:
– Final doc in Markdown format (GitHub Gist)
– Final doc in PDF format (Box)

Overview

I’ve been working on building out an internal offensive security function and got to the point where I need some internal documentation as to the service(s) description, engagement model, outcomes, etc. Like a lot of planning, I started with an Xmind #MindMap, and with all the buzz around #ChatGPT, I wanted to see how well it could take what I have and build some docs for me.

In addition to Xmind, I use Obsidian for note-taking, jotting down thoughts, and organizing documentation. The objective of this exercise is to end up with #Markdown-formatted text that I can make final edits to in Obsidian and, from there, publish to a documentation repository.

ChatGPT Prompts

First prompt

I started with a pretty robust prompt. With the exception of the Objective paragraph at the top, this was 100% copy-paste from Xmind to the ChatGPT prompt.

Objective: I want to create a service description and engagement model for an internal red team. This service description should be formatted like a document with section headings and subheadings. Format the output in Markdown. The service components will be as follows: - the team will manage the scheduling process - the team will determine the activities that must be completed to adequately test the target - the team will determine whether a specific test should be carried out by an internal team or if an external testing firm needs to be engaged Pre-requisites for initiating a test include: - a system architecture diagram - a completed threat model document - access to the systems that will be tested The activities performed by this function will include the following items Document processes so they are Repeatable Defendable Create and perform Reporting with the following characteristics templates regular cadence tied-in to the risk review process findings go into backlogs, jiras, and project plans, depending upon the team and target demonstrated ability to remediate findings 

Engagement with developers must include 
  Regular sessions with R&D and IT teams that are developing tools and systems  
  Demonstrations 
  Threat modeling 
  
Engagement with System Operators must include 
  Educational review sessions with teams 
    system operators 
    Blue team 
    system integrators
  Demonstrations of exploits
  Collaborative remediation sessions
  Threat modeling

And the output:

Image: ChatGPT output 1

Second prompt – measurable outcomes

Now I need to define some measurable objectives of the program in #OKR format so we can measure its success. The results were pretty meh. I’ve tried OKRs and #infosec success criteria in other chats, and they pretty much all come out the same. This section had to be heavily edited for the final version. Luckily, I’d already outlined them in the mindmap.

Image: ChatGPT OKR prompt and response

Prompts 3 thru 5 – flowchart

Now that I have a working service description and some needs-editing OKRs (all a bunch of words no one will read), I need a flowchart diagram. I’m a fan of Mermaid diagrams for their usefulness across various markup languages like Markdown and reStructuredText.

prompt

based upon the service description above, create a flowchart of the entire process beginning with scheduling and ending with concluding a set of tests. Output this flowchart in Mermaid syntax.

output

graph TD
A(Schedule Test)
B(Review Prerequisites)
C(Plan Activities)
D(Determine Testing Resources)
E(Perform Testing)
F(Record Findings)
G(Produce Report)
H(Engage with Developers and System Operators)
I(Conclude Tests)

A --> B
B --> C
C --> D
D --> E
E --> F
F --> G
G --> H
H --> E
H --> I


In this flowchart, the process begins with scheduling a test (Step A), followed by reviewing the prerequisites for the test (Step B). The team then plans the activities required for the test (Step C) and determines whether internal or external testing resources are required (Step D). The testing is then performed (Step E) and the findings are recorded (Step F). The team then produces a report (Step G) and engages with the developers and system operators (Step H) to remediate any issues. The process concludes with the finalization of the tests (Step I).

This was a decent start, and I had a couple of iterations (see the sketch after this list for where I was steering the chart):

  1. adding a decision to the chart after step E in order to determine whether the completed tests are adequate to move on to testing completion
  2. inserting some more steps: "In the Mermaid flowchart above, tests need to be concluded prior to producing a report. Also, add a step called 'Send to Product Management and Security Leadership' occurring in parallel with 'Engage with Developers and System Operators'. Also, conclude with a step called 'apply recommendations to target systems'"
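For reference, a sketch of the flow I was steering toward with these edits; the node labels here are my reconstruction, not ChatGPT’s verbatim output:

graph TD
A(Schedule Test)
B(Review Prerequisites)
C(Plan Activities)
D(Determine Testing Resources)
E(Perform Testing)
F(Record Findings)
X{Tests Adequate?}
I(Conclude Tests)
G(Produce Report)
H(Engage with Developers and System Operators)
J(Send to Product Management and Security Leadership)
K(Apply Recommendations to Target Systems)

A --> B
B --> C
C --> D
D --> E
E --> F
F --> X
X -- No --> E
X -- Yes --> I
I --> G
G --> H
G --> J
H --> K
J --> K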

There were still some errors at this point, such as ChatGPT insisting on putting the “Conclude Tests” step at or very near the end, even though it’s really about the mid-point of the workflow. I could have continued the prompts, but at this point I decided to finish this off by hand.

Prompts 6 and 7 – RACI

Now that the processes are defined, we need to identify the responsibilities of each of the roles and how they interact.

Image: ChatGPT prompt requesting a RACI for an offensive security program

From this, it’s clear that ChatGPT is good at formatting and generating a RACI matrix in #Markdown format, and this version has reasonable values for the roles, even if I don’t agree with them. Still, it was a good enough foundation that I could tweak it, but before that, I added one last parameter:

add a new role to the matrix called Red Team Manager. This role should be responsible for the scheduling and accountable for everything else. Add two new processes called Review Findings and Remediate Findings. The Developer is responsible for the former and the System Owner is responsible for the latter. Recreate the RACI matrix with these new parameters and output the Markdown code.

And this changed the RACI to basically make the manager accountable for everything.
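For illustration, the resulting matrix had roughly this shape; the letter assignments below are my paraphrase of the output rather than a verbatim copy:

| Process | Red Team Manager | Red Team | Developer | System Owner |
| --- | --- | --- | --- | --- |
| Scheduling | R | C | I | I |
| Testing | A | R | C | C |
| Reporting | A | R | I | I |
| Review Findings | A | C | R | C |
| Remediate Findings | A | C | C | R |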

Finishing up

At this point, I felt like I had the elements I needed, so I began the process of copy-pasting them from the interface into Obsidian and making tweaks to get a usable service description document.

The final output from using ChatGPT to create service description documentation:
– Final doc in Markdown format (GitHub Gist)
– Final doc in PDF format (Box)

Migrating PasswordSafe to KeePassXC

I’ve been a longtime user of #PasswordSafe (or “PWsafe”), going back to when Bruce Schneier was managing authorship and maintenance. With all the issues experienced by online providers like LastPass and 1Password (but especially LastPass, by miles), I think a local password database synced to a personal #NextCloud instance is the way to go. I’m happy with PWsafe; it’s worked well over the years, but I need to share a few passwords and would like some expanded functionality, such as managing SSH keys, so I looked to #KeePassXC, which appears to be the most up-to-date and maintained branch of the KeePass and KeePassX family. KeePassXC is desirable because it is natively multi-platform, whereas the original KeePass is written for Windows, and emulators are required to use it on operating systems like Linux.

Importing passwords

There is no direct import from the PasswordSafe format to the KeePass database format using KeePassXC, like there is from LastPass to KeePass. A tab-delimited file can be exported from PWsafe, and KeePassXC can import a comma-delimited (“CSV”) file; however, I make heavy use of nested groups, and the work to prepare the CSV file looked like a major pain. Luckily, the original version of KeePass supports direct import from PWsafe.

Armed with that knowledge, this was my path to import my passwords:
1. Open PasswordSafe and export the database in the XML format (be careful with this file and delete it when done!)
2. Download the latest KeePass 2.x from https://keepass.info/
3. Open KeePass, create a new KeePass version 2 database, and import the XML file
4. Export the file in the KeePass version 1.x database format
5. Close KeePass 2.x
6. Open KeePassXC and create a new database in a temporary location (it doesn’t matter, we won’t use it)
7. Import the KeePass 1.x database with the passwords
8. When prompted, choose the location and name where you want the database
9. Done!

Image: KeePass import dialogue box
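To sanity-check the import from a terminal, the keepassxc-cli tool that ships with KeePassXC can list what landed in the new database; a quick sketch, with a hypothetical database path:

# recursively list the imported groups and entries
keepassxc-cli ls -R ~/Nextcloud/Passwords.kdbx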

Finishing Up

Make sure to explore the settings, such as adding a Yubikey and/or keyfile. When everything is as you want it and working, delete the interim files (the XML export and the KeePass 1.x and 2.x databases), and make a plan to retire the old PasswordSafe data. A sketch of that cleanup is below.
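Since the XML export contains every password in plaintext, a plain delete isn’t ideal. A minimal sketch using shred, with hypothetical filenames:

# overwrite, zero, and remove the plaintext export and interim databases
# (less effective on SSDs and journaling filesystems, but better than rm)
shred -u -z pwsafe-export.xml temp-kp2.kdbx temp-kp1.kdb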
