Infosec Press

Reader

Read the latest posts from Infosec Press.

from J. R. DePriest

By the Lake

I read everything at the info kiosk of the Lake Ochonkmah Otter Lodge.

It used to be a hunting shack for otter hunters but was abandoned sometime around 1900.

In 1943, a husband-and-wife research team, Drs. Bartholomew and Candice Burroughs, “rediscovered” the location while hiking around the lake and studying the local otters, which were rumored to be particularly sociable and friendly. They made camp on the site and noted its location. Over multiple trips it became a bona fide research station and was repaired, built upon, and expanded.

It was their life's work for 30 years and they developed a niche following among otter aficionados. The otters at Lake Ochonkmah were very friendly and completely unafraid of humans. The Burroughses speculated that the hunting must have been easy, and they never had a good explanation for why the place had been abandoned.

One theory, borne out by examining remains, was that a mystery illness had thinned out the otters and wiped out the humans who knew about the location, leaving it free to recover and flourish.

The only reason to visit this place was to watch the otters. The water was far too cold year round to be comfortable for swimming and there was precious little in the way of game fish left after the otters had their fill.

Still, a small town grew up during the height of the Burroughs' research, a country store for supplies and a bed and breakfast style boarding house for transients and travelers.

On October 15, 1975, Candice died in her sleep at the age of 61. There was no warning as she had been working with Bart the day before and gave no indication of being sick. Bart became understandably withdrawn and depressed and focused on his work. Less than a month later, he died in his sleep, as well, on November 22, 1975. He was 63.

Without the support of the doctors, the research station fell into disrepair and what little tourism there had been stopped entirely.

In 1987, the millionaire Margo Fillings swept in like a tornado and revitalized everything.

She never said why she was so passionate about this place, but she turned the old research station into an overnight learning experience and encouraged schools to bus kids in to learn all about the Ochonkmah Otters.

The general store was re-opened with a more worldly selection of goods, snacks, candy, soda, and the like.

The bed and breakfast was remodeled into a proper family restaurant with the rooms being used to house the staff.

A motel opened just outside of town to accommodate any other travelers.

The rest of my class was still in the observation room, where it was kept dark so you could peer down through the long glass floor and see the otters in their natural habitat.

I was out in the well-lit hallway, trying to talk to the guide, but she kept ignoring me, telling me she had somewhere she had to be and going back and forth between an office in the back and checking on the observation room.

She ran back and forth and back and forth.

She had to squint to see inside the observation room and she'd look inside and shake her head.

And then scurry back to the office.

I didn't want to go into the observation room so I stayed out in the big lobby and read the infographics again.

Margo Fillings was the savior of the town according to the infographics.

She looked like a gymnast in her photos: short build, athletic, with thick legs, an attempt at a pixie cut but her red hair was too curly to stay down. Always smiling. Always looking directly at the camera.

My legs were thick, too, but so was the rest of me. Not so athletic. Sometimes, my legs would stop working and I'd have to sit down or lie down, but that didn't happen very often.

When the other kids from my class started to filter out of the observation room, I was looking for Angela and Angie, the best friends I'd ridden down with.

Angela was really smart, good at math like I was, but also good at music which I wasn't. Angie was an artist and barely passed any other classes, not because she couldn't but because she didn't feel like it was worth the effort.

They were my friends, my only real friends.

I had trouble making friends because I was prone to talking too much or saying the wrong thing. I did that all the time. I said the wrong thing and people got mad, but never told me why they got mad.

Angela came out and she was rolling her eyes while walking toward me.

“Angie found a boy,” she told me.

Angie would latch onto a boy and obsess over them.

Then she would date them, get to know them, and suddenly get over them.

Angie came out with her arm wrapped tightly around a tall boy's waist.

He wasn't even handsome or pretty. He had stringy hair and his clothes were too baggy. He looked dirty.

“He plays the fucking guitar in a band,” Angela told me.

That explained it, apparently.

Everyone else left, teachers, chaperones, students.

Everyone left except for Angela, Angie, Angie's new obsession and his “bandmates” who were just as dingy as he was.

Angie was pale and raven-haired like an angel might be, but she preferred to wear black; even her makeup was black.

Angela wore light blue slacks and a silk blouse. She was always so exquisite.

We stayed two more days, at the motel outside of town.

On the third morning, Angie was gone.

Angela told me that she'd left with the band and we'd be lucky to see her at all for a few weeks.

She was 17 and she liked to pretend she was an adult.

Angela was really quiet that day.

I think Angie didn't tell her that she was leaving with the band.

When I woke up the next morning, Angela was gone. Her clothes, her toiletries, her bag, and her car were all gone.

I walked back to town and into the Otter Lodge.

I walked in and told the lady who worked there, the same one from the overnight visit, that I was lost.

She asked me my phone number and I didn't know.

She asked me for my parents' names and I didn't know.

She asked me for my name and I didn't know.


According to Margo Fillings, the anomaly was here on her first visit to the town, back when she was first considering pouring her resources into it.

It looked like a teenaged white girl. Limp brown hair, a little pudgy, a little slow witted, but it spoke like a normal teenaged girl and it was wearing normal clothes.

She thought it was a mannequin because it was motionless, not breathing or moving. Its eyes were wide open, not blinking.

She touched it.

The skin was warm to the touch but stiff.

It shivered at her touch and immediately became supple.

Its chest began to rise and fall. When she looked at the face again, the eyes had closed. It appeared to be sleeping.

She assumed it was a runaway and woke it up.

Its first words were, "Hi, Margo!"

Margo says she maintained her composure, but "citation needed" you know.

When she asked it "What's your name?"

It replied something like, "Don't be silly; you know who I am."

So she gave it a name, "Lillian", after the flower, and it took it.

We know this because Margo kept a journal. I've read it. It's practically Exhibit A.

The journal says "I said the first name I could think of. I remembered seeing lilies out front, so I called her Lillian. It was a question, I asked her if her name was Lillian and she agreed that it was. That wasn't what I was asking, but she just accepted it."

But if you ask Margo about it now, she will tell you that the anomaly is, in fact, "Lillian Harper" and that she was always "Lillian Harper" and that they knew each other before she found her in the back of the research building gathering dust, before she gave her a name.

One time, a guest was here with her fiancé.

She was so kind, wearing white to contrast her dark wavy hair. She spoke like a poet; it was mesmerizing. She wrote about the trees and the flowers and the lake and the otters.

She found beauty everywhere she looked. Decaying leaves, moss, and mushrooms covering a fallen tree trunk. An otter's corpse washed up on the shoreline. The sun on her face and in her eyes. Storm clouds flashing in the distance. The sounds of the wind blowing the ghost lights over the water at night.

Her fiancé was comparatively grumpy. He was a writer, too, and they thought this place would inspire them both.

For him, it was uncomfortable, aggravating his allergies, covering him in ants and spiders, spoiling their food. He only saw unnecessary turmoil.

There can be beauty in unexpected difficulties, right?

She saw it. Her eyes sparkled with it; her soul glowed and reveled in it.

She was kind to me, even though I couldn't walk.

I was in a chair most of the time. I would be in my spot in the chair outside the old research center in the morning and back in my room at night.

My arms worked, my lungs worked. I could breathe and speak and think and smile. But my legs felt like nothing at all. Like empty shells filled with dirt. Like anchor weights tied to my pelvis.

I told them stories about the otters and about the people who used to work and live here.

I told them about the Drs. Burroughs and how they both died but nobody knew why. I thought it was the sadness.

This place had a sadness about it, always, but people would come and cover it up and ignore it.

They would find the life, the singing of the insects, the splashing of the otters, the waving of the trees, and ignore the emptiness underneath it.

They would study and sleep and observe and feel and love and eventually it would find them.

They would wonder where it all went and why it took so long to notice it was gone.

“Melancholy” they called it.

She thrived and grew and blossomed.

He withered.

All he left behind was a perfect bouquet of white lilies.

She threw them on the ground right in front of me.

They didn't wither.

They flourished.


They've had to send multiple agents because every other agent eventually believes the lies.
First question I asked? "Why not take it to a real lab instead of doing all the study here at a compromised location?"

Answer: Any attempt to remove the anomaly from the site results in tremors that get worse the further away it's taken.

So they keep sending us and once we stop sending in updates, they come and get us and send in someone else.

I've seen the photos and the records of the examinations of the anomaly and it definitely is not human.

It has the outward appearance of a teenage girl, but only superficially.

Its anatomy has been thoroughly detailed while it was in its dormant state.

Constant body temperature of 96 °F regardless of the outside conditions.

Smooth skin resembling that of a typical Caucasian but only from a distance. There are no pores and no body hair, not a single blemish. The skin cannot be cut or punctured using any methods we've devised and it doesn't bruise. There is no evidence of veins or blood flow of any kind, no pulse at all.

It has the shape of breasts but no nipples. It has buttocks but no anus. Instead of a vaginal canal and urethra, it has a shriveled phallus with no openings. There are no visible testes.

The head appears almost entirely human. It has nostrils that seem to lead to a nasal cavity. Eyes with tear ducts that react as expected to light even when it is dormant. It has eardrums and eyelashes and eyebrows. All the hair on the head seems to be attached as expected even when the rest of the body has none at all. It has a mouth with correct-looking teeth, a tongue, a trachea and esophagus, but its internal structure remains a mystery.

Endoscopy hits impossible dead ends when run down either throat tube.

It doesn't breathe when it's dormant so we aren't even sure if it needs air.

While dormant, it has been submerged in water for prolonged periods without any ill effects.

We have observed that when it returns to its active state, sometimes only parts of the body revive fully, such that it appears to be paraplegic or quadriplegic. It compensates by entering a semi-dormant state and "floating" between locations. Even when done in full view of locals, none of them recall seeing it happen.

It has never demonstrated this ability while fully active, only while semi-dormant, a state that resembles "sleeping".

One time, my friends put on a musical production of Grease.

They know that I love musicals and singing even if I can't participate.

They staged it around the Lodge so I could sit out front. Even though I couldn't walk and had trouble speaking, they made sure I felt like part of the show.

I was able to move my arms and smile to “You're the one that I want!”

“Oh yes, indeed!”

The spectators noticed and the cast sang “We go together” to me while I was able to shift back and forth.

It elevated everybody's spirits.

After the musical was over, after the people had said their goodnights to each other and to me.

After I basked in the feeling of accomplishment and acceptance, I drifted off to sleep.

I dreamed of swimming in the lake. The water is far too cold for swimming in real life, but it was warm in the dream.

I was so far out that I couldn't see the shore on either side. The lake isn't that big, but in the dream it was.

I wasn't afraid. I wasn't afraid at all because the water was calm and it supported me.

Deep, deep below me, I could see lights and motion as if an entire city were down there.

I wanted to go down there. I wanted to see who it was.

But my head refused to go under the water. I would try but the water would push me back. The water wanted me to stay up here.

I could almost hear them building something, creating something marvelous.

But it wasn't for me.


We've checked air, water, food toxicology. We've bagged insects and plants for allergens, poisons, or venoms.

We've run up antennas to check for electromagnetic sources, Geiger counters for radiation, and specialized microphones for ULF, ELF, UHF, and EHF.

Nothing.

We've even had Astrologers, Diviners, and Ley Line experts check it out.

The local Native Americans were, unfortunately, driven out and killed by settlers long ago. The only record we have is the name of the lake itself, "Ochonkmah", which looks like it's derived from something Native American but is too bastardized for a direct translation. It resembles the Choctaw word *achukma* which has positive connotations of "good" or "pleasing".

The only other anomaly is a strong magnetic source out in deeper waters, assumed to be an ancient meteorite. It's far too cold and deep for regular divers and we've yet to get permission to field a top-of-the-line manned submersible or ROV. Cameras and robot claws we drop on lines inevitably hit snags. Cameras show significant debris around the site. What artifacts we've hauled up match what we would expect from Viking longboats. There is no good reason to find that sort of debris at this location.

The magnetic source could help explain the ghost lights which are known to float over the water during particularly warm autumns.

We've caught them on film multiple times with various cameras. The purple glowing globules read very similarly to St. Elmo's fire. They cannot be ignis fatuus due to the lack of flammable gas. We've yet to have a boat on the water fast enough to observe them up close.

It had to be a dream, but it felt so real. It had to be a dream since nobody else saw it.

There was a festival in the main yard, but I was on the shore looking out over the lake.

I saw a silvery disk come out of the sky and make as if to land on the surface.

Before I could think, I was in the water, swimming with all my might toward it.

The water is too cold to swim in, but I was swimming and it wasn't that cold.

I never swam so fast before. My legs worked better than they ever had and pushed me forward while my arms carved great handfuls of water over and behind me, like I was climbing a mountain of snow.

I reached the disk and it was tiny, no bigger than a Frisbee.

I was certain it had been a spaceship but here it was no more than a toy.

I stopped swimming and found I could stand. The water out here should be quite deep but I stood up.

I looked back toward the shore and saw a tall, purple skinned humanoid motioning for me to pick up the disk.

His skin was dark and smooth, leading to thin arms that moved more like tentacles than something with bones and joints. His head was round like a matchhead and his eyes were black.

His slit-like mouth was smiling.

I'm not sure how I could tell it was smiling, but it was.

I picked up the disk and he pointed out further into the water.

I turned and saw a massive blobby creature, like something made of the squishy muck from the bottom of the lake.

It was rushing away from us with a massive crooked wake.

Parts of it seemed like stones or rocks and two of them turned and I saw they were huge eyes, watching my hand holding the disk.

I held it close to my chest and flung it out like a Frisbee and the blob leaped after it, a giant mass of barely held together pieces shaped into a huge dog's head on a turtle's body with flippers for legs.

It grabbed the disk in its mouth and collapsed back onto the surface of the water with a SLAP.

The purple creature was applauding me and motioned for me to come closer.


Occasionally, this place hits the news cycles and tourism has a temporary boost.

The businesses open back up, people show up to run them. People show up and buy tickets, souvenirs.

The otters get to entertain a new batch of people.

It goes like it always goes here.

It's great at first, then it gets rougher, then it gets angrier, then something bad happens and it dries up.

Disappearances usually.

Maybe murder but no bodies are ever found.

Rumors start to flow.

People get afraid again.

It goes dormant.

The anomaly is always part of the revival and she reacts very poorly to the negative happenings.

She plays really hard at being upset and not understanding why people can't be kind and get along.

There is no way to know if it is "genuine" sadness as she isn't human to begin with.

She's very convincing and seems to know intimate details of the lives of those who live here. She can speak to their wants, needs, dreams, fears, weaknesses, everything as if she is their best friend in the whole world.

She knows things about me that I won't put down in a report.

She knows things about our research that she shouldn't. When she gets deep into esoterica, her voice changes a bit, becomes monotone, almost like she's reading a script.

Ask her about it afterwards? She claims she doesn't remember and seems to freak out if you play back a recording of it.

I'm not sure how we can keep things from her as she seems to know everything that happens around the lake, including internal thoughts that are never voiced or written at all.

It may be too dangerous to continue the investigation and we may want to write off our losses and leave it be.

I thought I was like a daughter to her.

She took me in and I lived and slept under the same roof as Imelda, Margaret, Stephanie, and Beatrice.

I was there for their first loves and their first heartbreaks.

I was there when they wondered what the point of it all was.

I helped them find meaning. I helped them understand the nature of people and of men.

I thought they would be strong enough to go out on their own, but they always went back to someone.

They seemed to not know themselves unless they were supporting a man.

It was sad and I told Mrs. Glenn it was sad and she agreed with me.

Mrs. Glenn and I wanted the girls to be self-sufficient like she was.

She raised all four girls without a man and she did a fine job.

Being the proprietor of the restaurant meant she had room and board for them as long as they worked.

She never did put me to work on the floor and she never told me why not. I asked and asked until eventually I stopped asking.

But I helped her with my stories and with my advice.

I told her about the history of the lake and the research station and the fur trappers and the otters.

I told her about the ghost lights and about the silver disk that came down from the sky.

I told her about the riches that had been lost time and again by strange ships that should never have tried to sail.

I told her about the plants and insects and which ones were safe and which ones were to be avoided.

She spun those into the recipes a little at a time, spreading good cheer and health with each meal sold.

When Imelda left, no note, just all her things gone and her and her boyfriend nowhere to be found, she came to me and I had no answers.

Imelda hadn't confided in me. None of the girls confided in me anymore.

When I asked them why, they told me they “outgrew” me and that was that.

But Imelda had been distant for a long time, keeping to herself.

Margaret was learning how to cook the special recipes with her mom and Mrs. Glenn couldn't be happier.

It made no sense for her to be the next to leave without a word.

But she was gone. Her clothes were gone. Her man was gone.

Imelda had never called and she expected the same from Margaret.

She didn't ask me for advice this time.

She didn't talk to me for a long time.

Not until Stephanie was the next to go missing.

She talked to me “before Beatrice went away,” she said.

She told me she knew what was happening and she thought she was paying her dues.

She thought she was doing what was required by making the recipes and serving them.

She said her missing girls sang dirges to her from the deep water.

They sang to her and told her that she failed them and failed everyone and that the lake would take its price one way or another.

She told me all this like I could do something about it, like I was part of it.

I didn't understand, but I asked her what she thought the price might be and if she thought it was worth it, if she thought she might be willing to pay it if she knew what it really was.

I asked her that question because I wanted her to figure it out on her own. I wanted her to think about what was important to her. I wanted her to recognize the love she had for her daughters and how that was clouding her judgement.

I didn't know what she would do.

She drowned herself in the lake.

Beatrice took over the restaurant.

I was sent back to the research station.

She never talked to me again.


The otters here are another part of the anomaly. They are obsessed with humans. They study us just as much as we study them. They've formed a particularly large raft and maintained it over generations, which is quite unusual. The males and females and the offspring all seem to stay close. There are so many of them that, even though the lake is very large, there are very few game fish left for anglers to catch.

Some say they should have run out of food by now, but they obviously haven't.

They don't seem to be any smarter than other otters, but they ratchet up the curiosity.

So when they suddenly pulled away from the shore where the settlement was located, it was odd.

They were acting strange. The locals didn't seem to care much, even though much of the tourism relied on them.

It was doubly unfortunate because Lake Ochonkmah and the Otter Lodge had been featured on some popular podcast which got it recognized by real celebrities who were stopping by for photo ops.

I think after Tiger's birthday party, though, that the tourism will die back down, due to all the bodies they found.

The only one who noticed or cared about the otters was Melanie.

She definitely noticed during the birthday party and tried to get Mr. Fletch, who runs the tours, to do something, but he wasn't there.

Once he got back, I think she sent Axl Fucking Rose up there to talk to him. I was close enough to hear Mr. Fletch yell at him, telling him to mind his business and that he didn't care if they were sick as long as they were still in the water.

When I left the lodge, I noticed a white and red helicopter in the yard that had not been there before. Further away, toward the road I saw an area cleared of trees and a small, personal airplane was parked, also white and red. Toward the lake was a white and red jet ski. I noticed they all had little red ribbons on them and thought they must be presents.

A group of people were walking in from the road, surrounding an attractive black man in white slacks.

I recognized him: Tiger Woods.

I was excited that our little outpost was so famous that someone like him would visit and I realized it must be a birthday party.

I went down to the water to stay out of the way.

I wanted to see if the otters had returned to this side of the cove.

Once I scrambled down and got a better look, I saw that they were still as far away as they could be without going out into deeper water.

Additionally, they seemed to be agitated and moved in jagged bursts in the water.

I climbed back up the embankment and went to the General Store where Mr. Fletch ran the tours.

The small desk and register were vacant. I looked at the books and post cards and the souvenirs and smiled.

A man with long red hair came in and asked me a question.

“Excuse me, but is there something wrong with the otters?”

I was so excited that someone else noticed.

“I was thinking the same thing,” I said.

“I know the local bevy has a reputation for being friendly, but even for normal otters, they seem distressed.”

He explained that he noticed their fur was matted which would terribly diminish their ability to keep warm and swim.

I explained that they were normally on the near shore and that they'd fled to the other side days ago, long before everybody else showed up.

I further told him that I wasn't able to get any of the adults to understand how grave the situation was, not even the docents at the Lodge.

He was puzzled but didn't have anything else to say.

The party was starting and I went back down to the water's edge.

I saw the nice man with red hair go back to Mr. Fletch but Mr. Fletch seemed to be angry about something so the nice man left again.

I was so worried about the otters that I decided to go to them.

I slipped into the freezing water, not shivering, but feeling my legs go numb almost immediately.

I pushed deeper and started swimming.

I wasn't fast like I'd been in my dreams.

I kept my head above the water because I knew that would be the end.

I swam toward the otters and they ignored me.

I swam and felt something touch my legs.

I treaded water and looked down.

I looked down and the water was super clear.

It was clear and I saw Angie down there.

I saw Angie, I saw Dr. Candice Burroughs.

I saw Angie, and Dr. Candice Burroughs, and Margo Fillings, and Camilla Harper the poet.

I saw Vanessa Glenn and her daughters Imelda, Margaret, Stephanie, and Beatrice.

I saw them reaching for me, their smiling faces and their long outstretched arms and fingers.

I felt them touching my feet and my legs.

I expected them to be angry, but they were at peace.

They wanted me to be at peace.

I thought about how I was never truly loved here.

Nobody accepted me.

Nobody understood me.

I was merely tolerated.

I was never part of a family.

I was never a friend, only an acquaintance.

I didn't belong here.

I never belonged here.

I belonged somewhere else.

I belonged somewhere else.

Instead of going under to where they were, I floated on the water until it was golden.

Until the sky was silver and the water was gold.

I floated and I saw a place of crystal and glass, glowing with internal light.

I saw them standing on the platform embracing and laughing.

All the women from the water were up there and they were happy.

I floated toward them.

I wanted to join them, but the platform was too high and I wasn't allowed.

I wasn't welcome.

I sank away and wanted to cry.

I wanted to be alone and to cry.

A strange woman approached me.

Her face was plastic and her hair wasn't real.

She approached me and said, “I am your mother.”

I never had a mother before.

I heard her say “I am your mother” and the voice was pure bliss, like melted chocolate and rainbows and warm nights and the kiss of a kitten's whiskers.

I heard her and I believed her.

I believed her and I let her embrace me.


Like I said, a dozen bodies from the lake, all perfectly preserved, all women who went missing, even a couple nobody knew were missing yet.

A dozen bodies found on one hand and the disappearance of Melanie, the anomaly herself, on the other.

You know what the main office told me?

"Forget about it. It doesn't matter."

END_OF_LINE

#WhenIDream #Dreams #Dreaming #Dreamlands #Writer #Writing #Writers #WritingCommunity #ShortFiction #Fiction #Paranormal

CC BY-NC-SA 4.0

This work is licensed under the Creative Commons Attribution-NonCommercial-ShareAlike 4.0 International License


 
Read more...

from Bruno Miguel

A few tips for gaming on Ubuntu with the Steam snap, and my overall experience with the distro after a few days of usage

A few days ago, I switched from Arch Linux to Ubuntu. I've been thinking about starting to use a distro more focused on stability and less on having the latest everything. Ubuntu was already the operating system my wife and my father were using, mainly due to the Ubuntu Pro free tier, so I decided to go with it, too, and make my life easier when giving them support.

Surprisingly, the default Ubuntu experience has improved a lot since I last used it. When was it? Ten years ago? More? I can't recall, but I know it was a long long time ago, in a galaxy far far away, and I remember using Unity. I miss Unity. I installed some GNOME extensions, changed the typeface, used my own .fonts.conf, made a few other configuration changes, installed and built some utilities I had used on Arch for more than five years, and that was it.

The only issue I've had is using the Steam snap package and running games installed via Heroic Games Launcher's flatpak package, with shortcuts for them added to Steam. The confinement rules set for this snap don't let it access much outside its own sandbox, so I can't use the same setup I had on Arch.

I could install Steam with the --classic flag, which disables the confinement. However, if I'm using Snap for isolation, I might as well take advantage of it. I could also use the deb provided by Steam, but I want to stay close to the default as much as possible so I don't have to deal with as many potential issues.

There's also the possibility of installing Steam via Flatpak. I might do it eventually. But for now, I want to use the Snap package and send some reports to the team. If you want to submit them, too, take a look at the Testing page on the repo's wiki, and read the instructions on how to submit carefully.
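For reference, a rough sketch of the install commands for the options mentioned above (assuming snapd and Flatpak with the Flathub remote are already set up):

# the default, strictly confined Steam snap
sudo snap install steam

# the Flatpak alternative from Flathub
flatpak install flathub com.valvesoftware.Steam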

After thinking about this for a day, I remembered there's this application called Cartridges, which serves as a main hub for your gaming clients. Steam and Heroic are both supported, but for it to see the Snap package, you need to change the Steam path to /home/USER/snap/steam/common/.steam/steam (don't forget to change USER to your username).

While this is not ideal, because it means having yet another package installed, it solves the problem for me. My main game hub is now Cartridges.

For custom Proton forks, like Proton-GE, ProtonUp-QT supports the Steam snap package out-of-the-box.

You can also use different Mesa environments with the package. The repo's wiki has all the instructions. Shoutout to Diogo for mentioning this to me and for giving me a few tips that helped make the transition to this distribution easier.

#Linux #Ubuntu #GamingOnLinux

 
Read more...

from beverageNotes

I've been recently enjoying Mashbuild (https://www.washmodistilling.com/mashbuild), a whisky blended over in Washington, MO. It's a bit of a gimmick, but it's a tasty gimmick. Think “Infinity Bottle”, but at the barrel-level.

It's a 100 proof whisky that's fairly dark. It's not as fiery as those that have aged for a long time. I find it smooth enough to enjoy with just a splash of water—I also have it with ice.

On the nose I get leather with hints of cinnamon stick and honey. There's some heat mid-tongue, with greater heat at the throat. Honey and cinnamon with a brief hint of licorice or anise. The mouth feel is great, almost like coffee with cream.

There are some other flavors in there, but I'm not able to pick them out at the moment.

I like this.

I'll update when I put ice in the next dram.

 
Read more...

from csantosb

It is possible to contribute to improving #guix as the need for new functionalities, packages, fixes or upgrades arises. This is one of the strongest points of open communities: the possibility to participate in the development and continuous improvement of the tool. Let's see how it goes when it comes to guix.
Guix is a huge project which follows closely the #freesoftware paradigm, and collaboration works in two directions. You take advantage of other developers' contributions to guix, while you contribute back to the guix repositories with your fixes, updates or new features, once they have been tested. As a first approach, from my own experience, one may create a personal local repository of package definitions, for personal use. As a second step, it is possible to create a public guix channel, in parallel to contributing upstream.
Contributing your code to guix comes down to sending #email with your patches attached; it's that simple. Don't be intimidated by the details (this is used by lots of open communities, after all). Once your patches are submitted, a review of your code follows, see details. Some tools, like mumi, are helpful for that purpose.

In detail

Depending on the kind of contribution (new additions, fixes or upgrades), these simple steps will allow you to start contributing to guix:

git clone guix itself
from the guix repository, do:

guix shell -D guix -CPW
./bootstrap
./configure
make -j$(nproc)
./pre-inst-env guix build hello

add and commit your changes, minding the commit message format
beware your synopses and descriptions
remember to run the package tests, if relevant
check the license
use alphabetical order in input lists
do not sign off your commits
don't forget to use lint/style/refresh -l/dependents to check your code (see the sketch after this list)
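For reference, a minimal sketch of those checks from the guix checkout, using hello as a stand-in package name:

./pre-inst-env guix lint hello            # check synopsis, description, license, etc.
./pre-inst-env guix style hello --dry-run # preview formatting changes without applying them
./pre-inst-env guix refresh -l hello      # list the dependents a change would affect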

Boring and routine, right?

Use sourcehut

Most of the previous steps can be run automatically with the help of sourcehut's build farm #ci capabilities. Simply push the guix repository to sr.ht. At this point, it is possible to use this manifest file to run the lint/style/refresh -l/dependents testing stages on the yosys package definition, for example:

image: guix
shell: true
environment:
  prj: guix.guix
  cmd: "guix shell -D guix -CPWN git nss-certs -- ./pre-inst-env guix"
sources:
  - https://git.sr.ht/~csantosb/guix.guix
tasks:
  - def_pkg: |
      cd "$prj"
      _pkg=$(git log -1 --oneline | cut -d':' -f 2 | xargs)
      echo "export pkg=$_pkg" >> "$HOME/.buildenv"
  - setup: |
      cd "$prj"
      guix shell -D guix -CPW -- ./bootstrap
      guix shell -D guix -CPW -- ./configure
      guix shell -D guix -CPW -- make -j $(nproc)
  - build: |
      cd "$prj"
      eval "$cmd build --rounds=5 $pkg"
  - lint: |
      cd "$prj"
      eval "$cmd lint $pkg"
  - style: |
      cd "$prj"
      eval "$cmd style $pkg --dry-run"
  - refresh: |
      cd "$prj"
      eval "$cmd refresh -l $pkg"
  - dependents: |
      cd "$prj"
      eval "$cmd build --dependents $pkg"
triggers:
  - condition: failure
    action: email
    to: builds.sr.ht@csantosb.mozmail.com

Submit the manifest with

hut builds submit # --edit

You’ll be able to log into the build farm to follow the build process or to debug it with

hut builds ssh ID

Check the log here. As you can see, it fails: building of yosys succeeds, but building of packages which depend on it (--dependents) fails.

Advanced

Sourcehut provides a facility to automate patch submission and testing. Using its hub integration, one may just send an email to the mailing list related to your project (guix in this case), which mimics guix's behavior for accepting patches.
The trick here consists of appending the project name as a prefix to the subject of the message, for example [PATCH project-name], which will trigger the build of the previous .build.yml manifest file at the root of the project, after applying the patch. Neat, right?
If you have followed along, you'll notice that the previous build manifest file is monolithic, always affecting the same package (yosys), which is of little use, as we are interested here in testing our patch. Thus, the question of how to trigger a custom build containing an updated $pkg variable related to the patch to test remains open.
To update the contents of the $pkg variable in the build manifest, one has to parse the commit message in the patch, extracting from there the package name. This is not a problem, as guix imposes clear commit messages in patches, so typically something like

* gnu: gnunet: Update to 0.23.0

or

* gnu: texmacs: Add qtwayland-5

Fortunately, parsing these messages to get the package name, and thus the value of $pkg, is trivial.
Then, it remains to include in our build manifest a first task which updates the contents of "$HOME/.buildenv". This file is automatically populated using the environment variables in the manifest, and its contents are sourced at the beginning of all tasks. This mechanism allows passing variables between tasks.

echo "export pkg=value" >> "$HOME/.buildenv"

Send your contribution

Finally, once your changes go through all the tests,

use git send-email to create and send a patch
consider reviews, if any, updating your patch accordingly with git commit --amend
resend a new patch including a patch version (v1, v2 ...), as sketched below
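For instance, a minimal sketch (assuming the usual guix-patches@gnu.org address for new submissions; NNN stands for the issue number assigned to your submission):

# send the latest commit as a patch to the Guix patch tracker
git send-email -1 --to=guix-patches@gnu.org
# after addressing reviews, resend the amended commit as a v2
git send-email -1 -v2 --to=NNN@debbugs.gnu.org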

Interested? Consult the documentation for details; you'll learn a lot about contributing to a common good and collaborating with other people.
#ciseries

 
Read more...

from csantosb

Remote #ci is the way to go in #modernhw digital design testing. In this #ciseries, let’s see how to implement it with detail using sourcehut and a real world example.
Sourcehut is a lightweight #gitforge where I host my #git repositories. Not only is it based on a paradigm perfectly adapted to #modernhw, but its builds service also includes support for guix (x86_64) images. This means that we will be able to execute all of our testing online inside guix profiles, shells or natively on top of the bare-bones image.

Alu

Let's consider now a variant of the previous example with open-logic. Here, we concentrate on a toy design for demonstration purposes only, a dummy ALU emulator, which uses #osvvm as its verification framework and relies on a few #openlogic blocks. In this case, its dependencies are defined in a manifest.scm file, including both fw-open-logic and osvvm, among other dependencies.
Install dependencies locally, in a new profile with

cd alu
mkdir _deps
export GUIX_PROFILE=alu/_deps
guix install -P $GUIX_PROFILE -m .builds/manifest.scm
. $GUIX_PROFILE/etc/profile

In this case, we will test the design using, first, a custom-made makefile. Secondly, we will use hdlmake to automatically produce our makefile. Similarly to the previous #openlogic example, two build manifests are used:

profile1
profile2

You'll notice how some of the tasks are shared with the previous #openlogic example (update channels, auth and update profile).

osvvm

In this case, we also need to compile osvvm libraries

compile__osvvm, produces a compiled version of the #osvvm verification libraries; this is necessary as we are using here the tcl scripts included in the library itself to follow the correct compilation order. Libraries will appear within the local profile under $GUIX_PROFILE/VHDL_LIBS/GHDL-X.Y.Z

test

test, for a fully custom-made testing pipeline; in this case, using a Makefile
Simply source the .envrc file where the local $GUIX_PROFILE variable is defined, cd to the ghdl directory and call make to compile the design and run the simulation in two steps: first, clean everything and include the sources in their corresponding libraries with

make __clean_all __include

Then, produce a new Makefile using ghdl.

./makefile.sh # ghdl --gen-makefile ...

Finally, run the simulation with

make GHDLRUNFLAGS="--stop-time=4us --disp-time --ieee-asserts=enable" run

This will produce an executable file before running it with the provided parameters.
You may notice that, in this case, you somehow need to produce your own Makefile, or an equivalent pipeline, right?

hdlmake

Wouldn't it be nice if we had a tool to deploy online which produces makefiles for us? It exists, and its name is #hdlmake.

test__hdlmake
Source the .envrc file where the local $GUIX_PROFILE variable is defined, cd to the .builds/hdlmake directory where all Manifest.py files are located, and call hdlmake to produce the Makefile. Finally, just run make to compile the design, produce an executable and run it.
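A rough sketch of that task, assuming the directory layout described above and hdlmake's default Makefile-generation behavior:

. .envrc                # defines the local GUIX_PROFILE
cd .builds/hdlmake      # where the Manifest.py files live
hdlmake                 # reads Manifest.py and produces a Makefile
make                    # compile the design, build the executable and run it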

Check the resulting logs inline here, for example.

 
Read more...

from csantosb

Remote #ci is the way to go in #modernhw digital design testing. In this #ciseries, let’s see how to implement it with detail using sourcehut and a real world example.
Sourcehut is a lightweight #gitforge where I host my #git repositories. Not only is it based on a paradigm perfectly adapted to #modernhw, but its builds service also includes support for guix (x86_64) images. This means that we will be able to execute all of our testing online inside guix profiles, shells or natively on top of the bare-bones image.

Open logic

Let's see how in detail, using the cookbook as a starting point and taking as a complete example the fw-open-logic #openlogic firmware package, which comes with the electronics guix channel.
Get it with:

guix install fw-open-logic:out

Open logic is a useful #vhdl library of commonly used components, implemented in a reusable and vendor/tool-independent way. Like any other #modernhw library, it includes test sets for each of its components, using the vunit utility in this case.
To run the full test suite user-wide (using the default $GUIX_PROFILE), install its dependencies, defined in a manifest.scm file (ghdl-clang and python-vunit in this case):

cd open-logic
guix install -m .builds/manifest.scm
cd sim
python3 run.py --ghdl -v

or local to the project, using a profile

cd open-logic
mkdir _deps
export GUIX_PROFILE=open-logic/_deps
guix install -P $GUIX_PROFILE -m .builds/manifest.scm
. $GUIX_PROFILE/etc/profile
cd sim
python3 run.py --ghdl -v

go remote

Now, how do we proceed online using the #sourcehut #ci builds facility? Builds will spin up a new environment based on an up-to-date guix-system image when we push a commit to git.sr.ht, provided we include a .build.yml build manifest file, or a .builds folder with up to 4 build manifest files, at the root of the git project [1]. Be careful: this image is rebuilt daily using a crontab job, which is a good and a bad thing at the same time. On one side, you won't be using the same environment for your tests, which breaks #reproducibility (see the comments section below). On the other side, #guix is a rolling release, and fancy new features and fixes are added every day. Keep this in mind.
Let’s create a .builds folder in a topic test branch, with the following contents:

manifest.scm, list of dependencies in our project
guix.scm, default guix repository, redundant, included here for convenience
channels.scm, list of guix channels remote repositories, in addition to the default guix repository, from where we pull packages
We will be using here my own electronics channel (no substitutes), as well as the guix science channel (which provides substitutes).
(note how here we load the local guix.scm file, instead of making use of the %default-channels global variable)

scheme (load "guix.scm") ;;; %default-channels key.pub, auth key to access substitutes of packages in guix channels

build manifests

From now on, every new push to the test #git branch will trigger the execution of the tasks defined in the three build manifest files

profile1
profile2
shell1

The two profile build manifest files use a slightly different approach, and are given here for comparison purposes only. The shell build manifest uses an isolated shell container within the image itself to illustrate this feature.
Inside the manifests, I declare the image to use, guix, and the global environment variables sourced before each task is run: prj (project name), srv (list of servers with substitutes), manifest and channels (pointing to the corresponding files) and key (same). It is important to declare a trigger action, to receive an email with all relevant information in case of failure (log, id, commit, etc.).

tasks

What’s interesting here is the list of tasks. Some of them are common to all three manifests

env, useful only for debugging
guix__update__channels, replaces the default project-local guix.scm file with the output of

guix describe --format=channels

The goal here is to avoid pulling the latest guix upstream, which is useless and CPU- and time-consuming, and to use the local version instead. Remember that the guix system image we are using here is updated daily.
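As a rough sketch of that task body (assuming the channels file lives at .builds/guix.scm, as laid out above):

# pin the channels to the revisions already baked into the daily image
guix describe --format=channels > .builds/guix.scm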

guix__auth, runs the authorize command to add the key.pub file to guix, so that we will be able to download package substitutes when necessary

sudo guix archive --authorize < "$key"

Here, one may opt for doing a

guix pull --channels="$channels"

as in profile2, to set the revision of the guix channels we are using (remember channels are nothing but git repositories).
Note how in profile1 and shell1 we opt for a different approach.
guix__update__profile, where we create a _deps folder to be used as a local $GUIX_PROFILE (defined in .envrc).
Then, one of

# profile1
guix time-machine --channels="$channels" -- \
  package -p "$GUIX_PROFILE" \
  --substitute-urls="$srv" \
  -m "$manifest"

or

# profile2
guix \
  package -p "$GUIX_PROFILE" \
  --substitute-urls="$srv" \
  -m "$manifest"

will install the packages in $manifest into the $GUIX_PROFILE. I'm using here the time-machine mechanism to set the revision of the guix channels, depending on whether guix pull was run in the previous stage or not.
vunit, sets env variables in .envrc and runs python3 run.py --ghdl -v inside the sim directory
Note that here, we are using ghdl-clang and python-vunit packages, provided respectively by guix-science and the electronics channel.
guix__shell__test, used by shell1, makes use of time-machine (so no prior guix pull) to create a shell container in which to install the project dependencies. Then, it immediately calls run.sh to run the unit tests

guix time-machine --channels="$channels" -- shell -C --substitute-urls="$srv" -m "$manifest" -- ./.builds/run.sh

comments

You may check the logs of profile1, profile2 and shell1 manifests, including a section with logs per task, to better understand what’s going on here. Remember that #sourcehut gives ssh access to the builds by connecting to the runners in case of failures, which provides a practical way of debugging the manifest files.
You may see how, using the remote guix image, it is possible to deploy a series of tasks to test our #modernhw design as we develop it: we will get an email in case of failure to pass the tests. Here, I present three approaches: guix pull, to set the revisions of the repositories in use; time-machine, to achieve the same; and guix shell, to create an isolated container. These three alternatives are not all necessary here, of course, but are given as a simple and practical demo of what can be achieved with #guix, #sourcehut and #ci.
To conclude this long post, it is important to stress once again that the point of using #guix resides in its reproducibility capabilities. By keeping a couple of #plaintext files, namely manifest.scm and channels.scm, one can obtain #determinism in the execution of the tests. Even if the guix image is upgraded and rebuilt daily (and so it changes), by fixing the revision of our channels (remember: guix pull or guix time-machine) we always obtain the same products out of our tests, as we run the same (project and test) code within exactly the same environment.


[1] It is also possible to automatically submit builds when a patch to a repo with build manifests is sent to a mailing list. This is achieved by appending the project name as a prefix to the subject of the message, for example [PATCH project-name].

 
Read more...

from Kevin Neely's Security Notes

Image: a typical resume content extraction workflow, from neurond.com

I used to keep my résumé (from here, “resume”) very up-to-date. For a long time, I had a resume crafted in #LaTeX because I have a long history with using that typesetting and markup language for purposes other than the ones most people think of, e.g. I wrote my college English papers in it, I had a slew of templates I created while I was a practicing attorney that would create letters, motions, and envelopes from source .tex files, etc. Keeping content in text makes it more portable across platforms and applications, and the nature of Microsoft Word is that you need to fully re-create the resume every couple years because some invisible formatting munges the entire document.

TL;DR: I ended up using RenderCV, as described in the RenderCV section below.

In the time since I last relied upon a resume, the method of applying for jobs –and more importantly, how recruiters review submissions– has changed pretty drastically. And despite all the great advances in technology over the past ten years, apparently, HR systems still are not that great at parsing a PDF or Word doc into text that can be machine-read by whatever algorithms and/or AI they’re using to perform the first pass. Because of this, you want to make sure to submit a machine-friendly description of your experience. There really should be a standard for all this stuff that makes it easy on both the applicant and the hiring manager. Like, I don’t know, some sort of HR standards body or something. A standard has never emerged, and I suspect that LinkedIn has a lot to do with that.

Additionally, having an easy way to keep one's resume in sync and in multiple formats means that it can be quickly used for many purposes, from printing an attractive hard copy to piping it through some Fabric AI workflows. So this set me on a fairly long hunt for a system where I could write once and generate in multiple formats.

The search for a resume workflow

First round

LaTeX & Pandoc

Since my resume was already in LaTeX, using the 20 second CV set of templates –which I think is very nice– I went and updated that and then ran it through pandoc, which is a multi-format document converter. The results ended up being pretty poor and not useful. The PDF looked great, obviously, but pandoc did not understand the LaTeX very well and the Markdown required a lot of edits.

We want everything to look good upon compilation/export/save as/whatever, so this was not an option.

Interlude

I had kind of given up at this point, figuring I either needed to just go Google Docs or maintain a Markdown version and attempt to keep them in sync. Then, I came across a post about an auto-application bot and the author had a related project that used resume information formatted as YAML to create a specific resume based upon job description or LinkedIn post.

Resume from Job Description

This project is called resume render from job description (no cute animal names or obtuse references in this project!), and I gave it a try, but it appeared to require all the fields, including e.g. GPA. I don’t know about you, but I'm way past the point in my career where I'm putting my GPA on a resume, so it wasn’t that useful.

It was late on a Thursday night, so obviously it was time to look a bit further down the rabbit hole.

Online options

I found a number of projects that were a service model where they host and render the resume for you. These included resume.lol (I question the naming choice here), Reactive resume (opensource, excellent domain name, and it has nice documentation), and WTF resume (my thought exactly!).

These all came from a post of 14 Open-source Free Resume Builder and CV Generator Apps.

JSONResume

As I traveled further down the Internet search rabbit hole, I came across JSON Resume, an #opensource project with a hosting component where people craft their resumes in JSON and it can then render in a number of formats either via a command-line tool or within their hosted service, making it a kind of hybrid option.

At this point, I felt like I was almost there, but it wasn't exactly what I wanted. JSONResume is very focused on publishing within its hosted ecosystem. The original #CLI tool is no longer maintained, and a new one is being worked on, which appears minimal but sufficient for the task. A nice thing is that they have some add-ons and have created a sort of ecosystem of tools. Looking over the project's 10-year history, those tools have a tendency to come and go, but such is the nature of OSS.

The Award for “Project Most Suited to My Workflow” goes to….

Another great thing about JSON Resume is that they, i.e. Thomas Davis, have done a fantastic job of cataloging various resume systems out there in their JSON Resume projects section. There is so much interesting stuff here –and a lot of duplicative effort ahem see the “HR Standards” comment above– that you can spend a couple days looking for the project that best fits your needs. For me, I landed on RenderCV, which is not only in the bibliography, but also mentioned on the Getting Started page because there are tools to leverage JSON Resume from RenderCV!

So without further ado…

RenderCV

While RenderCV is a part of the JSON Resume ecosystem, in that people have created scripts to convert from the latter to the former, it is a completely separate and standalone project, written in #python and installable via pip. RenderCV's approach is to leverage a YAML file, and from that generate consistent resumes in PDF, HTML, Markdown, and even individual PNG files, allowing the applicant to meet whatever arcane requirements the prospective employer has.

graph LR

	YAML --> TeX & Markdown 
	TeX --> PDF & HTML & PNG

Resume generation workflow

Using RenderCV

Getting started with RenderCV is like pretty much any other project built in python

  1. Create a virtual environment using venv or conda, e.g. conda create -n renderCV python=3.12.4
  2. Install via pip with a simple command pip install rendercv
  3. Follow the quick start guide and create a YAML file with your information in it
  4. Run rendercv render <my_cv>.yaml
  5. View the lovely rendered résumé

Extending RenderCV

This was great, as I now have a very easy-to-edit source document for my résumé and can quickly create others. I’m hoping Sina, the author, makes the framework a bit more extensible in the future because the current templates are oriented toward people with STEM backgrounds looking for individual contributor roles. However, as some of us move further in our careers, the résumé should be less about skills and projects, but more about responsibilities and accomplishments as we lead teams. I have enhanced the “classic” and “sb2nov” themes so that they take these keywords as subsections to a specific company/role combination under the professional_experience section.

Theme update for Leaders and Managers

I created a fork which contains updates to v1.14, adding the “Responsibilities” and “Accomplishments” subsections for company: under the Experience section.
This allows leaders to craft their resume or CV in such a way that it highlights the breadth of their influence and impact to the organization.

The following themes support the additional subsections:
– markdown
– classic
– sb2nov

A non-updated theme will simply ignore the content under these subsections; omitting these sections will make the resume look like the original theme. Hopefully the framework will be more extensible in the future and I can add this as a pull request.
In the meantime, the forked repo at https://github.com/ktneely/rendercv4leaders should work on its own, or the /ExperienceEntry.j2.tex and /ExperienceEntry.j2.md files from those themes can simply be copied over the existing ones.

How to use

Usage is extremely straightforward, as this merely extends the framework with a couple of new keywords for the Experience section, each looking for a preceding company declaration. Here is an example:

professional_experience:
  - company: NASA
    position: Director of Flight Operations
    location: Houston, TX
    start_date: 1957-03
    end_date: 1964-06
    responsibilities:
      - Manage the Control room.
      - Write performance reports.
      - Smoke copious amounts of cigarettes
    accomplishments:
      - 100% staff retention over the course of 9 rocket launches.
      - Mobilized and orchestrated multiple teams to rescue astronauts trapped in space.
      - Lung cancer.

This will then render “responsibilities” and “accomplishments” as italicized sections under the job role, highlighting what a difference you made while performing in that role.

Maintaining Multiple Versions

This is basically what it all comes down to: the ability to maintain different versions for your target companies. While some work is being done to modularize the source content, it is not yet to the point where each section of the resume is a building block that can be invoked at compile time. What I do is maintain different YAML files and use the parameters in the rendercv_settings section to direct the output to different, meaningfully-named directories while maintaining a generic name for the file itself.
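
As a rough illustration, each target-specific YAML file needs only a slightly different settings block, something along these lines (the key names below mirror the render command’s CLI options, so treat them as an approximation and check the RenderCV documentation for your version; the folder and file names are placeholders):

rendercv_settings:
  render_command:
    output_folder_name: rendercv_output_largecorp   # one directory per target company
    pdf_path: Kevin-CV.pdf                          # generic file name regardless of target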

So, instead of “Kevin-LargeCorprole.pdf”, “Kevin-Startuprole.pdf”, etc., I simply send “Kevin-CV.pdf”. This way, it’s not incredibly obvious to the reviewer that I have specially crafted a resume for that job; it just happens to look like I have exactly what they’re looking for in my default resume.

Automation

Want to automate the build of your resume whenever you update the source file(s)? Look no further than the rendercv pipeline, which generates the output whenever you commit source to GitHub.
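
If you would rather wire this up yourself, a bare-bones GitHub Actions workflow gets a similar result. The following is only a sketch under my assumptions (it is not the rendercv pipeline itself, and the YAML file name, output folder, and trigger are placeholders):

name: build-resume
on: [push]
jobs:
  render:
    runs-on: ubuntu-latest
    steps:
      - uses: actions/checkout@v4
      - uses: actions/setup-python@v5
        with:
          python-version: '3.12'
      - run: pip install rendercv
      - run: rendercv render Kevin-CV.yaml
      - uses: actions/upload-artifact@v4
        with:
          name: resume
          path: rendercv_output/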

Also, since version 1.15, the --watch flag will watch the source file locally and re-compile every time you save the source YAML file.

References and further exploration

  1. Neurond.com blog post: What is a CV/Resume Parser and How Does it Work?, Trinh Nguyen, Aug 16, 2022.
  2. TeXMaker: an Open-source TeX editor
  3. RenderCV user guide
 
Read more...

from csantosb

Remote #ci is the way to go in #modernhw digital design testing. In this #ciseries, let’s see it in practice, in some detail, using two of the most popular forges out there.

Gitlab

The gitlab #gitforge includes tons of features. Among these is a facility called the container registry, which stores per-project container images. Guix pack allows the creation of custom #reproducible environments as images. In particular, it is possible to create a docker image out of our manifest and channels files with

guix time-machine -C channels.scm -- pack --compression=xz --save-provenance -f docker -m manifest.scm

Check the documentation for options.
Remember that there are, of course, alternative methods to produce docker images. The point of using guix is its reproducibility: you’ll be able to create a new, identical docker image out of the manifest and channels files at any point in time. Even better: you’ll be able to retrieve your manifest file out of the binary image in case the original gets lost.
Then, this image must be loaded into the local docker store with

docker load < IMAGE

and renamed to something meaningful

docker tag IMAGE:latest gitlab-registry.whatever.fr/domain/group/NAME:TAG

go remote

Finally, it is pushed to the remote container registry of your project with

docker push gitlab-registry.whatever.fr/domain/group/NAME:TAG

At this point, you have an environment in which to run your tests using gitlab’s ci features: set up your gitlab runners and ci configuration files to use this container to execute your jobs (a minimal .gitlab-ci.yml sketch appears below).
As an alternative, you could use an ssh executor running on your own fast and powerful hardware (a dedicated machine, a shared cluster, etc.). In this case, you’d rather produce an apptainer container image with:

guix time-machine -C channels.scm -- pack -f squashfs ...

scp this container file to your computing resources and call it from the #gitlab runner.
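
For the registry-based route, the ci configuration itself can stay tiny. The following .gitlab-ci.yml is only a sketch: the image path is the placeholder used above, and the script line borrows the neorv32 test command shown at the end of this post; adapt both to your design.

test:
  image: gitlab-registry.whatever.fr/domain/group/NAME:TAG
  script:
    - python3 sim/run.py --ci-mode -v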

Github

Github is probably the most popular #gitforge out there. It is similar to #gitlab in its conception (pull requests versus merge requests, you get the idea). It also includes a container registry, and the set of features it offers maps easily onto any other #gitforge following the same paradigm. No need to go into more detail.
There are a couple of interesting tips about using #github, though. More often than not, users run into #reproducibility problems when using container images hosted on ghcr.io, the hosting service for user images. These images are usually employed for running #ci testing pipelines, and they tend to break as upstream changes happen: updates, image definition changes, package upgrades within the image, etc. If you read my dependencies hell post, this should ring a bell.
What can be done about this where #modernhw is concerned? Well, we have #guix. Let’s try a different approach: building an image locally and pushing it to the #github registry. Let’s see how.

in practice

An example repository shows the way to proceed. Its contents allow you to create a docker container image to be hosted remotely. It includes all that’s necessary to perform remote #ci testing of a #modernhw #vhdl design.

docker pull ghcr.io/csantosb/hdl
docker images # check $ID
docker run -ti $ID bash

It includes a couple of #plaintext files to produce a #deterministic container. First, the channels.scm file, with the list of guix channels to pull packages from. Then, a manifest.scm, with the list of packages to be installed within the container.
The container image may be built with

image=$(guix time-machine --channels=channels.scm -- \
             pack -f docker \
             -S /bin=bin \
             --save-provenance \
             -m manifest.scm)

At this point, it is to be loaded into the docker store with

docker load < $image
# docker images

Now it is time to tag the image

docker tag IMID ghcr.io/USER/REPO:RELEASE

and login to ghcr.io

docker login -u USER -p PASSWORD ghcr.io

Finally, the image is to be pushed remotely

docker push ghcr.io/USER/HDL:RELEASE
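
Once the image sits in the registry, a workflow can run the tests inside it. The following .github/workflows/ci.yml is only a sketch under my assumptions (the job name and trigger are placeholders, the script line reuses the neorv32 test command from the next section, and it is not taken from the example repository):

name: ci
on: [push]
jobs:
  vunit:
    runs-on: ubuntu-latest
    container:
      image: ghcr.io/csantosb/hdl
    steps:
      - uses: actions/checkout@v4
      - run: python3 sim/run.py --ci-mode -v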

test

You may test this image using the neorv32 project, for example, with:

docker pull ghcr.io/csantosb/hdl
docker run -ti ID bash
git clone --depth=1 https://github.com/stnolting/neorv32
cd neorv32
git clone --depth=1 https://github.com/stnolting/neorv32-vunit test
cd test
rm -rf neorv32
ln -sf ../../neorv32 neorv32
python3 sim/run.py --ci-mode -v
 
Read more...

from Ducks

From 49.12.82.250 to 195.201.173.222: lots of domains moved, both IPs in Hetzner space. Many of the domains are fake crypto investing sites #cryptoscam, along with other scam sites.

 
Read more...

from Бележник | Notеs

R hslfow szev nvmgrlmvw yvuliv, gszg, rm gsv zfgfnm lu gsv kivxvwrmt bvzi, R szw ulin'w nlhg lu nb rmtvmrlfh zxjfzrmgzmxv rmgl z xofy lu nfgfzo rnkilevnvmg, dsrxs dzh xzoovw gsv Qfmgl; dv nvg lm Uirwzb vevmrmth. Gsv ifovh gszg R wivd fk ivjfrivw gszg vevib nvnyvi, rm srh gfim, hslfow kilwfxv lmv li nliv jfvirvh lm zmb klrmg lu Nlizoh, Klorgrxh, li Mzgfizo Ksrolhlksb, gl yv wrhxfhh'w yb gsv xlnkzmb; zmw lmxv rm gsivv nlmgsh kilwfxv zmw ivzw zm vhhzb lu srh ldm dirgrmt, lm zmb hfyqvxg sv kovzhvw. Lfi wvyzgvh dviv gl yv fmwvi gsv wrivxgrlm lu z kivhrwvmg, zmw gl yv xlmwfxgvw rm gsv hrmxviv hkrirg lu rmjfrib zugvi gifgs, drgslfg ulmwmvhh uli wrhkfgv, li wvhriv lu erxglib; zmw, gl kivevmg dzings, zoo vckivhhrlmh lu klhrgrevmvhh rm lkrmrlmh, li wrivxg xlmgizwrxgrlm, dviv zugvi hlnv grnv nzwv xlmgizyzmw, zmw kilsryrgvw fmwvi hnzoo kvxfmrzib kvmzogrvh.

 
Read more...

from Бележник | Notеs

“Just as water, gas, and electricity come from far away into our dwelling, with the help of an almost imperceptible movement of the hand, in order to serve us, so we shall be supplied with pictures or with sequences of tones that will appear at a slight movement, almost a sign, and will likewise leave us.”

 
Read more...

from Kevin Neely's Security Notes

I finally decided to move my #NextCloud instance from one that I had been operating on the #Vultr hosting service to my #HomeLab.

A note on Vultr: I am impressed with this service. I have used them for multiple projects over about 10 years, paying by various means from credit card to #cryptocurrency, and I cannot even remember a downtime that impacted me. (In fact, I think there was only one real downtime, which was planned, well-communicated, and didn’t impact me because my setup was fairly resilient). With a growing volume of data, and sufficient spare hardware that wasn’t doing anything, I decided to bring it in-house.

This is not going to be a full guide, as there are plenty of those, but I did run into some hurdles that may be common, especially if a pre-built Nextcloud instance was used. So this is meant to provide some color and augment the official and popular documentation.

Getting started

Plan out the migration

Migration Overview

Essentially, there are three high-level steps to this process:

  1. Build a new Nextcloud server in the homelab
  2. Copy the configuration (1 file), database (1 backup file), apps (install apps), and data (all user files) over to the new system
  3. Restore all the copied data to the new instance

Preparing to Migrate

  1. Start with the NextCloud official documentation for migrating to a different server as well as:
    1. Backing up Nextcloud
    2. and the restoring a server doc
  2. Check out Nicholas Henkey’s migrate Nextcloud to a new server blog post. This is very thorough and has some great detail if you’re not super familiar with Nextcloud (because you used a pre-built instance)
  3. For the new build:
    1. A full set of installation instructions, placing [Nextcloud behind an Nginx proxy](https://github.com/jameskimmel/Nextcloud_Ubuntu/blob/main/nextcloud_behind_NGINX_proxy.md).
    2. An older install document for Installing Nextcloud on Ubuntu with Redis, APCu, SSL & Apache

Migration

While the official documentation describes the basics, the following are the steps I recommend. This is at a medium level, providing detail but (mostly) not the specific command-line arguments; a consolidated command-level sketch follows the list.

  1. Build the new server
    1. Use your favorite flavor of Linux (I used Debian, and these notes will reflect that)
      1. install all updates,
      2. install fail2ban or similar security if you’re exposing this to the Internet.
      3. name the new system the same as the outgoing server
    2. Download the Nextcloud install from the nextcloud download site and choose either:
      1. update the current system to the latest version of whatever major version you’re running, and then download latest-XX.tar.bz2 where ‘XX’ is your version
      2. identify your exact version and download it from nextcloud
    3. Install the dependencies (mariaDB, redis, php, apache, etc. etc.)
      1. note: if the source server is running nginx, I recommend sticking with that for simplicity, keeping in mind that only Apache is officially supported
    4. Unpack Nextcloud
    5. Validate that it’s working
    6. Place it into maintenance mode
  2. Backup the data

    1. If using multi-factor authentication, find your recovery codes or create new ones
    2. Place the server into maintenance mode
    3. Backup the database
    4. copy the database backup to a temporary location on the new server
  3. Restore the data

    1. Restore the database
    2. copy /path/to/nextcloud/config/config.php over the existing config.php
    3. rsync the data/ directory to the new server
      1. you can remove old logs in the data directory
      2. you may need to use an intermediary step, like a USB drive. It’s best if this is ext4 formatted so you can retain attributes
      3. the rsync options should include -Aaxr; you may want -v and/or --progress to get a better feel for what’s going on
      4. if rsync-ing over ssh, the switch is -e ssh
    4. If you have installed any additional apps for your Nextcloud environment, rsync the apps/ directory in the same way as the data dir above
    5. Validate the permissions in your nextcloud, data, and apps directories. Fix as necessary; see Nicholas Henkey’s post (linked above) for commands
    6. Redirect your A or CNAME record to the new system
    7. Configure SSL on the new system
    8. Turn off maintenance mode
    9. Log in and test! :fingers-crossed:
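
Putting steps 2 and 3 together, the command-level version looks roughly like the sketch below. It assumes a MariaDB/MySQL database and user both named nextcloud, installs under /var/www/nextcloud on both machines, and uses newserver as a stand-in for the new host; adjust all of these to your setup.

# On the old server: maintenance mode, then dump the database
sudo -u www-data php /var/www/nextcloud/occ maintenance:mode --on
mysqldump --single-transaction -u nextcloud -p nextcloud > nextcloud-db.bak

# Copy the backup, config, apps, and data to the new server
scp nextcloud-db.bak newserver:/tmp/
rsync -Aaxr -e ssh /var/www/nextcloud/config/config.php newserver:/var/www/nextcloud/config/
rsync -Aaxr -e ssh /var/www/nextcloud/apps/ newserver:/var/www/nextcloud/apps/
rsync -Aaxr -e ssh --progress /var/www/nextcloud/data/ newserver:/var/www/nextcloud/data/

# On the new server: restore the database and fix ownership
mysql -u nextcloud -p nextcloud < /tmp/nextcloud-db.bak
chown -R www-data:www-data /var/www/nextcloud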

Troubleshooting

Hopefully everything is working. Make sure to check the logs if something is broken.

Log locations:

  • the nextcloud.log in the data/ directory
  • the apache logs in /var/log/apache2
  • the redis logs in /var/log/redis
  • the system logs, accessible with journalctl

Reiterating: Remember or check for these items

These are the specific notes I took as I ran into problems that I had to work around or solve. These are incorporated in the above, so this is basically a restatement of the gotchas I ran into:

  • upgrade the current one to the latest version of the current release (i.e. the latest of the major version you are on, so if you were running 29.0.3, get to 29.0.9)
    • this makes it easier when you download <version>-latest.tar.bz2
    • If you’d prefer to skip that, use the nextcloud download site with all available versions. Make sure to grab the same one and compare the specific version as listed in config.php. Example: 'version' => '29.0.9.2',
  • use the same name on the new server
  • use the same web server. Apache is officially supported, but if you’re using nginx, it will be easier to stay on that.
  • Most multi-factor authentication, like WebAuthN, FIDO hardware keys, etc. will not work over HTTP in the clear.
    • IOW: make sure you have recovery codes
  • If the apps aren’t copied over, the new server sees them as installed rather than installable. I suppose one could “delete” or remove them in the admin GUI and then reinstall, but otherwise, there was no button to force a reinstall.
  • Files and data you need to copy over after creating the install. Do each of these separately, rather than all at once:
    • if you have any additional apps, copy the apps/ directory over
    • copy config.php
    • copy the data/ directory
  • Is your current install using Redis-based transactional file locking?
    • If the previous system was using Redis and it is still in the configuration, the new system will not be able to obtain file-locking and essentially all users will be read-only and not able to modify or create new files.
    • In config.php, you will see settings such as 'redis' and 'memcache.locking' => '\\OC\\Memcache\\Redis',
    • make sure Redis is installed on the new system and running on the same port (or change the port in config.php)
    • Install the necessary software: apt install redis-server php-redis php-apcu
    • Ensure that the Redis and APCu settings in config.php are according to the documented single-server settings

The Memcache settings should look something like the following configuration snippet. Alternatively, you could enable and use the Redis process’s UNIX socket instead of TCP.


'memcache.local' => '\OC\Memcache\APCu',
'memcache.distributed' => '\OC\Memcache\Redis',
'memcache.locking' => '\OC\Memcache\Redis',
'redis' => [
     'host' => 'localhost',
     'port' => 6379,
],
 
Read more...

from Kevin Neely's Security Notes

Nextcloud administration notes

These instructions and administrative notes were written for the pre-built Nextcloud provided by hosting provider Vultr. As a way to de- #Google my life and take back a bit of #privacy, I have been using a Vultr-hosted instance for a couple years now and it has run quite well. These notes are really aimed at the small instance for personal use. Please don’t use my notes if you’re responsible for an enterprise server!

Upgrading Nextcloud

#Nextcloud, with all its PHP-based functionality, can become temperamental if not upgraded appropriately. These are my notes to remind me how to not completely break things. When upgrading, the first pass will usually bring you to the most up-to-date version of Nextcloud in your major release, e.g. an instance running 27.1.4 would be brought up to 27.1.11. Running the script again would bring the instance to 28.0.x.

To update a Nextcloud server running on the #Vultr service to the latest version, you need to follow the steps below:

  1. Backup your Nextcloud data: Before starting any update process, it's always a good idea to create a backup of your Nextcloud data. This will ensure that you can restore your data in case of any unexpected issues during the update process.
    1. Shutdown the OS with shutdown -h now
    2. Power down the instance in Vultr
    3. Create a snapshot
    4. Wait
    5. Wait some more – depending on how much data is hosted on the system
    6. Power it back up
  2. SSH into the Vultr server: To update the Nextcloud server, you need to access the server using SSH. You can use an SSH client such as PuTTY to connect to the Vultr server.
  3. Switch to the Nextcloud user: Once you are logged in, switch to the Nextcloud user using the following command: sudo su -s /bin/bash www-data.
  4. Navigate to the Nextcloud directory: Navigate to the Nextcloud directory using the following command: cd /var/www/html (could be /var/www/nextcloud or other. Check what's in use)
  5. Stop the Nextcloud service: To avoid any conflicts during the update process, stop the Nextcloud service using the following command (as www-data): php occ maintenance:mode --on 
  6. Update the Nextcloud server: To update the Nextcloud server, you need to run the following command (as www-data): php updater/updater.phar. This will start the update process and download the latest version of Nextcloud.
  7. Update the OS, as needed, with apt upgrade
  8. Start the Nextcloud service: Once the update is complete and verified, you can start the Nextcloud service using the following command: sudo -u www-data php occ maintenance:mode --off.
  9. Verify the update: After the update process is complete, you can verify the update by accessing the Nextcloud login page. You should see the latest version of Nextcloud listed on the login page.
  10. Assuming all is running smoothly, the snapshot that was created in step 1 can be safely deleted. Otherwise, they accrue charges on the order of pennies / gigabyte / day.
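
For quick reference, steps 3 through 8 condensed into commands (paths are the ones used on this pre-built instance; adjust if your install lives elsewhere):

sudo su -s /bin/bash www-data
cd /var/www/html
php occ maintenance:mode --on        # step 5: stop client activity during the upgrade
php updater/updater.phar             # step 6: run the built-in updater
exit                                 # back to the sudo-capable user
apt upgrade                          # step 7: OS updates, as needed
sudo -u www-data php /var/www/html/occ maintenance:mode --off   # step 8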

Some other notes

Remove files in the trash

When a user deletes files, it can take a long time for them to actually disappear from the server.

root@cloud:/var/www/html# sudo -u www-data php -f /var/www/html/cron.php
root@cloud:/var/www/html# sudo -u www-data php occ config:app:delete files_trashbin background_job_expire_trash

Set files to expire

root@cloud:/var/www/html# sudo -u www-data php occ config:app:set --value=yes files_trashbin background_job_expire_trash

 
Read more...

from Sirius

The first-century BC Greek historian Diodorus is considered a compiler of ancient sources, among them some of the teachings of Democritus of Abdera. In his work, Library of History (Book I, Chapter 8), we find an account of the origin of living beings and of the first men, which specialists such as Diels, Vlastos, Reinhardt, and Beresford attribute to the teachings of Democritus. Beginning my studies of Protagoras who, as a disciple of Democritus, shared with him certain naturalist and humanist conceptions, I present a translation of Diodorus's account of prehistory. Fortunately, Diodorus's Library of History has been made available in English by the University of Chicago on this site.

I transcribe below Diodorus's account of the first men, as an initial text for studying the connection between the thought of Democritus and that of Protagoras (including the similarities and differences with the myth of Prometheus and Epimetheus attributed to Protagoras in Plato's dialogue of the same name):

Diodorus's account of prehistory

(…) the first men to be born (…) led an undisciplined and bestial life, going out one by one to secure their subsistence and feeding both on the tenderest herbs and on the fruits of wild trees. Then, since they were attacked by wild beasts, they came to one another's aid, instructed by necessity, and, having gathered together in this way out of fear, they gradually came to recognize one another's characteristics. And although the sounds they produced were at first unintelligible and indistinct, little by little they managed to articulate their speech and, by agreeing among themselves on symbols for each thing that presented itself to them, made known among themselves the meaning to be assigned to each term. But since groups of this kind arose in every part of the inhabited world, not all men had the same language, for each group organized the elements of its speech by mere chance. This is the explanation for the present existence of every conceivable kind of language and, moreover, from these first groups that were formed arose all the original nations of the world.

Now the first men, since none of the things useful for life had yet been discovered, led a miserable existence, having no clothing to cover themselves, not knowing the use of dwellings and fire, and being entirely ignorant of cultivated food. For since they also neglected even the harvesting of wild foods, they laid up no store of their fruits against their needs; consequently, a great number of them perished in the winters from the cold and the lack of food. Little by little, however, experience taught them both to seek out caves in the winter and to store up those fruits that could be preserved. And when they became acquainted with fire and the other useful things, the arts as well, and everything capable of furthering man's social life, were gradually discovered. Indeed, generally speaking, in all things it was necessity itself that became man's teacher, providing appropriate instruction in every matter to a creature well endowed by nature and possessing, as assistants for every purpose, hands, logos (reason), and anchinoia (mental sagacity).

And as regards the first origin of men and their most primitive way of life, we shall content ourselves with what has been said, since we wish to keep due proportion in our account.

#Filosofia #Demócrito #Protágoras

 
Read more...

from Tai Lam in Science

I need to figure out how to reasonably deal with mail and deliveries privately.

How it started

I donated to a local nonprofit in 2024, and I really shouldn't say this, but I honestly wish I never did. However, this is not for the reason you probably expect.

I started to receive significantly more junk mail from charitable nonprofits and groups, more so than usual (at least since the 2020 COVID-19 pandemic). I won't name specific names, but this was a local nonprofit which has a total annual budget size between the order of $1 million and $10 million.

(To the reader: if we know each other IRL, then I'll tell you who the offending org is; and if you're savvy with implementing an actionable fix for the issue below, then maybe we can work out a way for me to get out of this rut of a “situation” — as if this is or should be my highest-priority project to take on right now. Let's just say that some of you will be surprised by the org I have in mind, which either intentionally uses the services of data brokers, or at least has some heuristic workflow that is leaking donor info to data brokers. The overall situation has a bit of a tragic irony.)

I'm (usually) not a vengeful person, at least when it comes to nonprofit orgs genuinely acting in good faith; but I am keeping a running list of these other orgs that engage in buying/selling/sharing snail mail lists as orgs I won't donate money to in the future, due to their respective disregard for mail privacy. However, there are 3 national-level orgs that have (so far) never sold me out to physical mail lists: the ACLU, including state chapters; the EFF; and the Freedom of the Press Foundation. I am purposefully excluding comparatively technical groups that would respect the privacy and security of others in general, such as the Signal Foundation and The Tor Project.

On the other hand, the only other way to avoid excessive physical mail list tracking is to donate to small local nonprofits. (Any method is fine — if you're super concerned about protecting your membership info, using a PO box for your mailing address and renewing your member dues via paper check is more than sufficient for most local community members.) This is because these groups literally don't have the money to spend for mass mail solicitations or blanket marketing.

After this happened, I told a local activist that I'm going to go straight for a paid plan on Privacy.com (at least the lower tier) and skip the free plan. Additionally, I commented that my reaction was essentially the “I can't believe you've done this” meme. (Somehow, I initially confused this with the “Charlie bit my finger” meme.)

How it's going (and the future)

I no longer think it's safe for me to order computers and ship the delivery to my residential address, using my own debit card. (That does remind me – I really should get a credit card for better payment protection and everything else that encompasses.)

I remembered that I ordered the HP Dev One in 2022 and the outer shipping box wasn't even taped closed when it arrived on my doorstep. Due to my living situation since 2020, I no longer trust anything that goes through the mail, and after Andrew “bunnie” Huang's assessment of overall supply chain security after the 2024 exploding pager incident in Lebanon, I think it's about high time I figure out the logistics of shipping to a private mail box (PMB) – or maybe I use a friend's address and/or credit card to purchase an online-only computer (while I pay my friend back for the cost, of course).

However, quite a few large computer manufacturers, who primarily have B2B (business-to-business) though also some minor B2C (business-to-consumer) sales, will tell customers during checkout that sending deliveries to a PO Box is not allowed. This includes Lenovo, HP, and even Framework. (I have to double-check for System76.) This is partly why I was sad when Costco stopped selling ThinkPad laptops in store (one probable cause might be the pandemic, but that's another matter).

If you have any somewhat serious plans to become a Linux distro maintainer or even a package maintainer (such as for the AUR/MPR), you should at least consider this while threat modeling. I recall Ariadne Conill tweeting about how a Lenovo ThinkPad laptop that they tried ordering online was suspiciously redirected to Langley, Virginia while en route to their home in early 2022, which was symptomatic of mail interdiction. However, those tweets were deleted around late 2022 or early 2023.

 
Read more...