Bitcoin's, and crypto's, issues in general stem mainly from an ill-informed model of how the system should work. Fractional reserve banking never was, and never can be, a functional model for bitcoin.
Thinking of Bitcoin as literal, in-your-hand gold would, in my less-than-humble opinion, significantly clarify how Bitcoin should be used. It's not meant to be an interest-bearing instrument such as a CD or an interest-bearing savings account. Its only usable purpose is as a store of value over time. How that value is set depends only on what people are willing to trade those "coins" for in the real world.
Bitcoin's biggest fault is the expectation the average bitcoin holder attaches to those "assets." Most often they're purchasing Bitcoin only in the hope, or expectation, that it will increase, drastically increase, in value. It's almost like playing the lottery: yes, some will win, but most will fail to find anything of value.
Using what I'm going to call the gold model, we can start to solidify the intended, and only sustainable, use case.
I recently started working with a new company and am rapidly becoming more familiar with Gmail and its "assumptions." I wrote this feedback message to them:
What has happened over the last several years? Gmail used to actually be an effective tool in my daily work tasks. Now it's literally better to keep a local copy of my inbox and sync it. When I'm in a search that has is:unread and I click an email, don't update the list of emails to remove the previously unread message. (Mark it as read, create a "clone" row of the existing message that is flagged to display anyway, and let that be the new line item.) Organizing the mailbox has become nearly impossible on any Gmail-branded interface. The mobile app on Android is absolute garbage for productivity. Stop using "labels" and put back normal folders. I want the option for exclusive placement (this lives here and only here, but might also carry a tag relating to this project). Inbox rules shouldn't be labeled with the "skip inbox" nonsense; just put "Move to folder: <drop down>" and be done with it. Adding labels is a cool thing, but it's not the end-all-be-all of organization.
I'm unlikely to see a reply to this, but it started some wheels turning. Namely, what led to this situation? Was it some "brilliant" front-end engineer who thought tagging would be the way of the future, or was it a technical decision on the backend architecture (database redesign, updates, etc.)?
I've been drinking coffee my whole life. Some would say I started way too early. Along the way I've perfected (?) the art of making coffee in a drip coffee maker. Without more than a glance at the consistency of the grounds I can tell you how much coffee is too much to put in the drip basket.
This brings me to the real reason for this article. When did making coffee in a drip coffee maker, or making coffee in general, become such a skill? I work for someone almost twice my age who can't make coffee without either using too small a filter or too many grounds. The result is coffee full of grounds, or coffee sludge all over the counter when I come in. My biggest pet peeve, though, is probably him leaving the container of coffee grounds open.
Great coffee shouldn’t be a mystery. It should be basic Drip Etiquette.
For those who, unlike me, follow every single Satoshi's worth, you may have noticed a steady but marked climb in the price of bitcoin over the last 2+ years. I find this interesting because it has outpaced the growth in other "value holding" assets such as gold or, in many cases, real property.
I have a theory that a large number of consumers have turned to the growth of bitcoin as a hedge against generally rising prices. And some, although fewer, have used it as a high-return savings account, hoping for a time when interest rates would again fall to more natural levels.
Given the expectation (we can argue about how likely it is to happen) that the U.S. Federal Reserve will lower interest rates for the first time in around a year sometime this month (September 2024), financed assets should become less expensive month to month. I believe that a large amount of the Bitcoin being held as an inflation hedge will be liquidated and poured back into real assets.
This leads, indirectly, to a conclusion that we will see at least a 20% price correction in the market if rates keep dropping, starting at 250bp/qtr for the next year and finally leveling off around 3.74% about this time next year. This will drastically lower monthly payments, and for those who see it coming it will be a clear signal to collateralize some of their BTC gains from the last 2 years. That money will move into down payments on mortgaged properties, ultimately re-stabilizing the real property market after a marginal, say 8%, price correction coming around Oct 10th. That correction will signal a short but bitter bidding bonanza from private equity and HNWIs to buy up the available assets.
After this massive "buy-back," real asset prices will again climb around 14% over the next 18-24 months.
What does this mean for BTC HODLers? Well, it means the opportunity to buy the dip yet again.
TL;DR, internet rando writing nonsense, move along.
P.P.S, not advice, not your friend, not your advisor.
I've lived nearly all my life in the SERC region (the Southeastern portion of the Eastern Interconnection), and my family has been in the area since before the dams that now feed the region's power systems were constructed. I have an extremely high level of respect for the people who operate our national and local grids and welcome any of them to comment with their thoughts on the article.
Setting the Stage
I went back to my hometown for Christmas 2022. During that stretch of record cold temperatures I was staying in a rental cabin where I didn't have access to the typical winter preps people in the region have been making for years (a wood fireplace, propane or kerosene heaters, non-electric heat). I was prepared for potentially breaking down during my drive and being stranded for several hours or overnight, but I wasn't prepared for what the weather might bring after I arrived at my lodging for the next several days.
Unbeknownst to me, the cabin I rented did not have any auxiliary heat to back up the central heat, only two split units, which were running at full blast when we arrived and still couldn't bring the cabin to the set point. Heating this cabin should theoretically have been a non-issue, but several factors reduced the split units' effectiveness: 1) too little, or no, insulation in the floor, 2) proximity to a creek behind the property, 3) still-installed ducting running under the cabin that conducted cold air inside, and 4) generally poor insulation. Those building faults, combined with temperatures near 0 Fahrenheit, became too much for the system to overcome.
Thankfully the cabin did have an electric space heater. If it weren't for that space heater I don't know what we would've done; likely I would've wound up at another location for the night. We hauled the oversized space heater into the bedroom and closed the doors for the night. I set it to 68 or 70 degrees, put on some warm clothes, and crawled into bed.
Outage?
At around 4:00 AM I woke up and noticed that the power seemed to be out. I figured a tree had fallen, or someone had run off the road and taken out a power pole; the power going out when it's cold was quite common when I was growing up in the region. I didn't think much of it and figured the power would come back on soon. It was probably out for 40 minutes this time, and I was pleasantly surprised when it came back quicker than expected. I jumped out of bed and turned the space heater back on to try to recover the several-degree drop that had occurred during the blackout. I got back in bed but failed to fall asleep due to the lack of heat in the room. I was only in bed for maybe 15 minutes before I gave up and went to the other room to let my partner sleep.
At this point it was probably around 5:50 or so. I decided to start making some coffee in the hopes that it might warm me up, since the space heater and the split units couldn't do the job. I turned on the coffee pot, and it was almost as if I had tripped a breaker: the power went out again. My gut said the local utility was being forced to implement power cuts at the direction of TVA.
As expected, about 20 minutes later my power came back on. I made my coffee and tried to re-warm the house. The space heater was keeping my partner warm, but I had to put on a couple more layers of clothes to stay warm in the roughly 58-degree living room. The split unit simply couldn't recover the temperature in the cabin.
After getting some hot coffee in my system I started looking for confirmation that my assumptions were correct. I found it in the form of a post from our local utility provider. They mentioned that, in accordance with ELCP step 50, local power providers were being asked to reduce their system load by 10%. To most people that doesn't sound like much and ideally wouldn't be very hard to achieve. However, that ignores a lot about our aged infrastructure and the level of granular control a local operator typically has over the distribution of electricity. Substations are three-phase powered, which means that from the substation out to the low-voltage system there is limited control over the loads. Each station can only turn on or off each of the output phases feeding the medium-voltage lines, and those single-phase lines are what typically power the end users of a local grid. What's interesting about these types of rolling blackouts, for the local population, is the oddity that occurs closer to the substations.
"[My neighbors across the road have power but I don't]" was something I read several times while following the situation on social media. The local power companies typically cycle through each phase, turning off one phase at a time per substation. The main circuits, typically within the city limits and where the critical infrastructure exists, are prioritized, as they must be, to ensure the continued operation of essential services like water, sewer, the sheriff, and emergency services. As the substations get further from the main feedpoint of a system, the loads under their control get less and less dense. So the main substation turning off a single phase would shed about 33% of the utility's system load, whereas a substation further out controls a smaller share: cutting one phase there is still about a third of that substation's load, but much less of the overall system. I'm going to assume the area outside the city limits accounts for about 60% of the system load and the inner city about 40%. If a single substation fed that remaining 60%, cutting one of its three phases would shed about (1/3 × 3/5), or roughly 20%, of system load, and in practice that 60% is usually split across several substations, so each individual cut is smaller still. Typically there are several substations operating as high-voltage feedpoints, with roughly a 5:2 ratio of medium-voltage substations to feedpoint substations: for every 2 high-voltage feedpoint stations there would be about 5 medium-voltage substations. That ratio depends on local population changes and topography as much as on electrical demand.
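To put rough numbers on it, here's a quick back-of-the-envelope sketch. The load shares are my assumptions from above, not utility data:

```c
/* Back-of-the-envelope: what fraction of total system load disappears when
 * one phase is cut at a given substation? The shares below are assumptions,
 * not utility data. */
#include <stdio.h>

static double phase_cut(double substation_share_of_system)
{
    return substation_share_of_system / 3.0;  /* one of three phases */
}

int main(void)
{
    double main_station = 1.00;  /* main substation carrying the whole system */
    double edge_station = 0.60;  /* substation feeding the load outside city limits */

    printf("Main station, one phase: %.0f%% of system load\n", 100 * phase_cut(main_station));
    printf("Edge station, one phase: %.0f%% of system load\n", 100 * phase_cut(edge_station));
    return 0;
}
```

Which prints roughly 33% for the main station and 20% for the edge station, and the edge figure only gets smaller as the outlying load is split across more substations.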
So for 3 of the medium-voltage substations, a different phase would be cut. It might look something like this:
Station ID | Phase 1 | Phase 2 | Phase 3
1          | off     | on      | on
2          | on      | off     | on
3          | on      | on      | off
4          | on      | on      | on
5          | on      | on      | on

Possible load curtailment implementation for a given 15-20 minute period.
If you look above, two substations remain completely powered. This might be due to critical infrastructure requirements, or simply because curtailment at only 3 stations in the system is sufficient to meet the requirements for a given curtailment step. That is really up to the emergency planning and the ground situation for the local and regional power distributors.
As I was passing time early Christmas morning, I was curious what the situation looked like for the hardworking individuals actually trying to prevent a system collapse. I found a post from TVA indicating that power demand was at a record high, over 100 megawatts more instantaneous demand than the previous record set just hours prior. The instantaneous demand was only half the story, though: the energy delivered over the 2-day period was gigawatt-hours more than the previous records. For those unfamiliar with power delivery, that's a shitload of power. Imagine a large city of about 700,000 people just popping into existence: assuming roughly 1.5 kW of average load per person, that works out to an average load of about 1,100 megawatts. TVA announced new record energy delivery in the days following Christmas.
Like most people with a smartphone, I often get news suggested to me by my device. More and more often I hit a paywall when trying to read the full article. This, this paywall fiasco, is why so many people are increasingly willing to trust unreliable sources these days: that's all the _news_ they have reliable access to.
At first, when I saw more and more news organizations implementing a "free tier" consisting of a fixed number of free articles over a rolling period, I thought it was no big deal and would even be good for journalism. Now, almost every single time I attempt to view a news article, I'm met with the complete inability to consume the information, thanks to the organization's uncaring attitude toward the flow of information.
More on this later, but I just wanted to post something.
Is anyone else tired of having to email notes back and forth, or of having a nice-looking but effectively unused cross-platform note-taking app?
I was introduced to Drafts a couple of years ago by a work colleague and immediately fell in love with it. I had tried several services over the years and there was something limiting about almost all of them (Drafts included), but what Drafts does, it does better than any note app I've ever encountered. It's right in the name: this app is for your initial brain-dump of text into a single place. That place is synced across all your (Apple) devices, and for the most part everything just works, and works well.
Around this same time I was building my own note-taking app as a cross-platform desktop application to accomplish the same task. It was called java-journal in its first release. It worked well enough, but it never left beta, as doing cross-platform file storage (put this folder in a default location for the operating system) led to a level of annoyance I had not expected to encounter. While there are several ways this could be improved, I've all but abandoned the project.
I'm now planning to move to a "cloud" based solution to this problem. At the very least I want to be able to open the dumbest of browsers and write a note reliably. As a side goal, I'd like to be able to do this offline as well as while connected.
Efficient cross-platform development seems so out of reach. I've yearned to create cross-platform applications since college, but recently the dev bug has gotten ahold of me. Given all the systems I regularly use (main desktop: Linux; work: Windows; mobile: macOS), I really only want to write a piece of software once and use it across all my computers. The obvious choice is Java, given I have nearly ZERO experience working with JavaScript, and JavaScript-based application platforms like Electron seem really bloated to me.
However, I have come to realize that Java apps lack beauty, if you will. I have been using Macs since freshman year at uni and have come to appreciate Apple's general program aesthetic. Given those details, I've set out to create a pretty cross-platform application.
Originally this was known simply as java-journal in my code repos and internal notes. It was really quite a simple application when I first reached a usable development state.
JLite 0.0.0
The interface is quite simple at this point. This initial GUI is as much about verifying the functionality of the SQLite backend and the underlying "entry" logic as it is about making a truly pretty creation.
Once I had a couple of people check its functionality (and fixed more than a few bugs), I turned to making it pretty.
I’m still working on making this pretty, but I want to get this post out so I can get feedback/opinions.
I've almost always been told this is too hard a problem to solve, so why even try? I never really know how to respond to that reaction. Usually I file this type of quip under one of two assumptions: they're being lazy, or I don't know what I don't know.
When this problem was first presented at work, I was leaning toward simply writing a realtime kernel to run in place of the Linux kernel, one that would handle all the hardware required for the project. My enthusiasm rapidly deflated as I pursued this option. I rather quickly realized the Pi isn't quite like any other Single Board Computer (SBC) I had used up to this point.
One major difference between a Pi and nearly all other platforms comes from the fact that the main processor is one of the last components to be initialized. On the Pi, the graphics (co)processor is the first thing to start up, and it loads external code and executes it. The graphics processor identifies which version of the Pi it is running on and selects the corresponding kernel binary from the /boot partition. Then, after configuring the main processor, it loads the kernel image into RAM and starts the main CPU core at the entry point of the image it just copied there.
The problem wasn't actually getting the system to take my custom kernel and load it. The problem arose when I needed to access the Broadcom SoC's peripherals, such as the GPIO ports. Another major consideration that led to abandoning this idea came when I realized we would be required to interface with a mobile app using Bluetooth or Wi-Fi. I wasn't about to handcraft, or even borrow, a full TCP/IP stack, a full Wi-Fi driver, and a robust webserver, all of which would have to run on a single core without real process management.
In the end I essentially stumbled upon a couple of articles from several years ago (50 or so) about dedicating entire CPU cores to a single process, or even a single thread, when realtime or "near-realtime" processing is required. Using a combination of SCHED_FIFO scheduling and the sched_setaffinity CPU affinity options, I was able to dedicate one of the Pi's cores to the main GPIO thread under Linux. This also required a change to the cmdline.txt file: setting the isolcpus option (which isolates CPUs from the general scheduler) to reserve a couple of cores for my realtime threads.
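Roughly, the recipe looks like the sketch below. This is a minimal illustration, not the project's actual code; the core number, priority value, and the assumption that cmdline.txt contains something like isolcpus=2,3 are all placeholders.

```c
#define _GNU_SOURCE
#include <sched.h>
#include <stdio.h>

/* Pin the calling thread to one core and move it into the SCHED_FIFO class.
 * Assumes the kernel was booted with e.g. "isolcpus=2,3" in cmdline.txt so
 * the chosen core is already clear of ordinary tasks. */
static int claim_core(int core, int priority)
{
    cpu_set_t set;
    CPU_ZERO(&set);
    CPU_SET(core, &set);

    /* pid 0 means "the calling thread" */
    if (sched_setaffinity(0, sizeof(set), &set) == -1) {
        perror("sched_setaffinity");
        return -1;
    }

    /* Raising to a realtime policy requires root or CAP_SYS_NICE. */
    struct sched_param sp = { .sched_priority = priority };
    if (sched_setscheduler(0, SCHED_FIFO, &sp) == -1) {
        perror("sched_setscheduler");
        return -1;
    }
    return 0;
}
```

Call something like claim_core(2, 80) at the top of the sampling thread and, for all practical purposes, that thread becomes the only thing that ever runs on core 2.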
After doing both of these steps, I had one of my threads running as the only runnable thread on core 2. This thread manages the GPIO calls; because reading the pins at a regular interval is essential for correctly debouncing the input signal, no other thread in the system has a higher priority. The data is read into a buffer for later processing. The processing thread was spun up using this same method on another core to handle debouncing the stored data. The final thread wasn't run with such a high priority, but it was run on a specific core, which means the Linux scheduler can preempt or pause its execution at any time. This final thread was responsible for making all the decisions based on the current system state.
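The debouncing itself doesn't need to be fancy once the sampling interval is steady. Something like the counter-based sketch below is the general idea; this is my own illustration, not the code we shipped, and the sample-count threshold is an arbitrary placeholder.

```c
#include <stdint.h>

#define DEBOUNCE_SAMPLES 5  /* placeholder: consecutive reads required to accept a new level */

struct debounce {
    uint8_t stable;   /* last accepted pin level */
    uint8_t last_raw; /* previous raw sample */
    uint8_t count;    /* consecutive samples at last_raw */
};

/* Feed one raw sample per tick; returns the debounced level. */
static uint8_t debounce_step(struct debounce *d, uint8_t raw)
{
    if (raw == d->last_raw) {
        if (d->count < DEBOUNCE_SAMPLES)
            d->count++;
        if (d->count >= DEBOUNCE_SAMPLES)
            d->stable = raw;    /* level held long enough: accept it */
    } else {
        d->last_raw = raw;      /* level flipped: restart the count */
        d->count = 1;
    }
    return d->stable;
}
```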
This setup resulted in acceptable responsiveness for the application. What it wasn't, though, was true realtime. I saw timing jitter on the order of tens of milliseconds, or around 10%. That wasn't a massive deal for our use case. If it had been, I believe the issue lies with the library we used to manage the GPIO interface. If I were to do this over again, I would manage the GPIO myself; in my initial investigation I missed the fact that the library already samples at a configurable rate on its own, so I probably didn't need to duplicate that effort.