• 0 Posts
  • 70 Comments
Joined 1 year ago
Cake day: June 15th, 2023


  • There’s no one standard…except for faxes.

    HL7 has been around for decades and FHIR for over a decade. Exchanging data is actually the easy part.

    The problem is typically more on the business logic side of things. A good example: matching a patient to a particular record between facilities is a much harder problem than people realize, because there are so many ways to implement patient identifiers differently, and because whoever inputs a record can screw up the entry. Another is that sex/gender codes can be implemented wildly differently between facilities. Matching data between systems is the really hard part (see the toy sketch below).

    (I used to do HL7 integration, but have since moved more to the systems side of things).
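
    Toy illustration of what I mean (made-up segments, not real patient data, and nowhere near production matching logic): two facilities can describe the same person in ways that don’t line up on a single field.

    ```python
    # Two PID segments describing the same person, as two different facilities
    # might actually send them. Both segments are fabricated for illustration.
    pid_a = "PID|1||000123456^^^HOSP_A^MR||SMITH^JOHN^Q||19800101|M"
    pid_b = "PID|1||123456^^^CLINIC_B^MR||Smith^John||01/01/1980|Male"

    def demographics(pid: str) -> dict:
        """Pull a few identifying fields out of a pipe-delimited HL7 v2 PID segment."""
        f = pid.split("|")
        mrn = f[3].split("^")[0]                    # PID-3: identifier; assigning authority differs per site
        last, first = (f[5].split("^") + [""])[:2]  # PID-5: name components
        return {
            "mrn": mrn.lstrip("0"),                 # leading zeros: one of many formatting mismatches
            "name": f"{last} {first}".strip().upper(),
            "dob": f[7],                            # PID-7: YYYYMMDD at one site, MM/DD/YYYY at the other
            "sex": f[8][:1].upper(),                # PID-8: "M" vs "Male" vs a local code table
        }

    print(demographics(pid_a))
    print(demographics(pid_b))
    # Even after crude normalization the DOBs still don't compare equal, and nothing
    # guarantees the two MRNs refer to the same human. That's the business-logic problem.
    ```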


  • Such a company has little motivation to completely change to something new, since they’d have to retain this for anyone that hasn’t switched.

    They’ve had motivation since the HITECH Act passed in 2009. Medicare/Medicaid compensation is increasingly directly tied to real adoption of modern electronic records, availability, and interoperability. Most healthcare orgs rely heavily on Medicare/Medicaid revenue, so that’s a big, big deal.

    You’re dealing with it first hand, so you know what’s involved.

    I do. Which is why I’m actively and aggressively removing fax machines from our environment. Efaxing (e.g., fax-to-email gateways) will stick around for backward compatibility with outside organizations, but the overall industry trend is to minimize the footprint of fax machines, because they’ve traditionally been used in ways that cost the company serious revenue when they cause you to miss CMS measures.


  • Speaking as someone who works directly in the field: this is just plain factually incorrect. Encrypted email is compliant with patient privacy regulations in the US.

    The issue is entirely cultural. Faxes are embedded in many workflows across the industry and people are resistant to change in general. They use faxes because it’s what they’re used to. Faxes are worse in nearly every way than other regulatory-compliant means of communication, outside of “this is what we’re used to and already set up to do.”

    I am actively working on projects that involve taking fax machines away from clinicians and backend administrators. There are literally zero technical or regulatory hurdles; the difficulty is entirely political.



  • I’m guessing you weren’t around in the 90s then? Because the amount of money set on fire on stupid dotcom startups was also staggering.

    The scale is very different. OpenAI needs to raise capital at a valuation far higher than any other startup in history just to keep the doors open another 18-24 months. And then continue to do so.

    There’s also a very large difference between wide-ranging bad investments and extremely concentrated ones. The current bubble is distinctly the latter. There hasn’t really been a bubble this dependent on massive capital investments by a handful of major players before.

    There’s OpenAI and Anthropic (and by proxy MS/Google/Amazon). Meta is a lesser player. Musk-backed companies are pretty much teetering on the edge of also-ran status, and there’s a huge cliff for everything after that.

    It’s hard for me to imagine investors who don’t understand the technology now, after getting burned by it, being enthusiastic about investing in a new technology they don’t understand that promises the same things but is “totally different this time, trust me.” Institutional and systemic trauma is real.

    (took about 15 years because 2008 happened).

    I mean, that’s kind of exactly what I’m saying? Not that it’s irrecoverable, but that losing a decade-plus of progress is significant. I think the disconnect is that you don’t seem to think that’s a big deal as long as things eventually bounce back. I see that as potentially losing out on a generation’s worth of researchers, and one of the largest opportunity costs associated with the LLM craze.


  • Sure, but those are largely the big tech companies you’re talking about, and research tends to come from universities and private orgs.

    Well, that’s because the hyperscalers are the only ones who can afford it at this point. Altman has said GPT-4 cost in the neighborhood of $100M to train (largely subsidized by Microsoft). The scale of capital being set on fire in the pursuit of LLMs is just staggering. That’s why I think the failure of LLMs will have serious knock-on effects on AI research generally.

    To be clear: I don’t disagree with you re: the fact that AI research will continue and will eventually recover. I just think that if the LLM bubble pops, it’s going to set things back for years, because it will be much more difficult for researchers to get funded going forward. It won’t be “LLMs fail and everyone else continues on as normal,” it’s going to be “LLMs fail and do significant collateral damage to the research community.”


  • There is real risk that the hype cycle around LLMs will smother other research in the cradle when the bubble pops.

    The hyperscalers are dumping tens of billions of dollars into infrastructure investment every single quarter right now on the promise of LLMs. If LLMs don’t turn into something with a tangible ROI, the term AI will become every bit as radioactive to investors in the future as it is lucrative right now.

    Viable paths of research will become much harder to fund if investors get burned because the business model they’re funding right now doesn’t solidify beyond “trust us bro.”




  • Hardware like that has been and is still being donated through third parties daily.

    It’s more in Ukraine’s interest to limit the use of Starlink to only those terminals that have been vetted through official channels than to allow blanket use and try to filter out things through other means due to… the exact kinds of situations this article is talking about.

    but that would require the CEO of the company to actually want to help honestly.

    Sure. And part of the reason we know Starlink is entirely capable of geofencing (toy sketch of the idea below) is because Elon’s done it explicitly to stop Ukraine from being able to operate near Crimea. That whole kerfuffle led to military usage being pushed over to Starshield and a contract with the US government that gives them explicit say on when and where Starlink works in Ukraine.

    Elon is dumber than a bag of hammers but it’d be next level stupid even for him to willingly break a DOD contract, especially when people were already floating the idea of invoking the Defense Production Act last time around.
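
    For anyone wondering what “geofencing” actually amounts to, conceptually it’s nothing more than denying service based on a terminal’s reported position. Here’s a toy sketch with made-up coordinates and a plain distance check; it has nothing to do with how Starlink actually implements it.

    ```python
    from math import radians, sin, cos, asin, sqrt

    def haversine_km(lat1, lon1, lat2, lon2):
        """Great-circle distance between two points, in kilometres."""
        lat1, lon1, lat2, lon2 = map(radians, (lat1, lon1, lat2, lon2))
        a = sin((lat2 - lat1) / 2) ** 2 + cos(lat1) * cos(lat2) * sin((lon2 - lon1) / 2) ** 2
        return 2 * 6371 * asin(sqrt(a))

    # Hypothetical denied area: (centre_lat, centre_lon, radius_km). Made up for illustration.
    EXCLUSION_ZONES = [(45.0, 34.0, 150.0)]

    def service_allowed(term_lat, term_lon) -> bool:
        """Deny service to any terminal reporting a position inside a denied area."""
        return all(
            haversine_km(term_lat, term_lon, lat, lon) > radius
            for lat, lon, radius in EXCLUSION_ZONES
        )

    print(service_allowed(50.45, 30.52))  # far from the zone: True (served)
    print(service_allowed(45.3, 34.4))    # inside the hypothetical zone: False (denied)
    ```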




    As long as you don’t need particularly tight tolerances or fine details, it works perfectly fine. The setup really isn’t any more complicated than I described. I have done it, just because I wanted to see how difficult the process is. It’s around $100 in startup costs, assuming you have access to a printer. After that it’s mostly just waiting and occasionally measuring cut progress.

    Check out the Rack Robotics Powercore as well. It’s a low-cost wire EDM system that uses cheap 3D printers as a motion platform, and it uses a very similar principle to cut metal, with wire as the cutting tool. It may or may not be more suitable depending on your exact use case. It’s still pretty rough around the edges, though; SendCutSend makes more sense for most people who just need things cut from flat stock.




    The reality is this is one of a handful of emerging technologies that are going to reshape a lot of things about the world in ways I don’t think society, as a whole, is cognizant of, let alone prepared for.

    This is one of them. The battlefield use of small drones is another.

    I tend to say that the world we’re living in now is one where gun control is increasingly obsolete. That’s not a moral judgment. It’s not a statement on whether that’s a good or bad thing. It’s just what I think we’re going to increasingly find to be the new reality: the rise of small scale, low cost, divertible manufacturing technologies is going to make traditional supply-side approaches to regulation untenable. That genie is out.

    (Drones are in a similar, if distinct space: low cost, commodity, and divertible from low/no regulation supply chains in a way that makes it nearly impossible to cut off supply without shutting down other legitimate economic activity).

    I don’t know what the right answer is. I do think it’s going to take a pretty fundamental rethink of how we approach these problems. I don’t think the full ramifications of these types of technology have really reached the wider zeitgeist, and, frankly, I kind of worry about how people will react. There are a lot of pretty scary paths this could take, both in terms of how the technology gets used and in terms of what attempts to curb them could look like if they’re not carefully thought through.


  • “Machine” is borderline overselling it.

    The ECM process works by pumping electrolyte-laden water through a metal part. When current is applied between the part and a cathode, the exposed metal on the part slowly dissolves (rough removal-rate numbers below).

    What these groups are doing is starting with high pressure hydraulic pipe and inserting 3D printed jigs that are basically a negative mask to bore out the pipe to their desired diameter, cut the chamber, machine in rifling, etc, with the end product being a functional barrel. As far as I’m aware, so far this has been limited to pistol caliber cartridges; rifle calibers are a step up in pressure and come with a whole host of different engineering challenges.

    The “machine” is really nothing more than a bucket, an aquarium/pond pump, and a desktop power supply. It’s honestly a really clever approach to the problem from an engineering standpoint.
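
    For a rough sense of scale (my numbers, purely illustrative): Faraday’s law of electrolysis puts an upper bound on how fast ECM can dissolve metal. The sketch below assumes steel treated as pure iron, 5 A from a desktop power supply, and ideal current efficiency.

    ```python
    # Back-of-the-envelope ECM removal rate via Faraday's law. Illustrative only.
    F = 96485        # Faraday constant, C/mol
    M = 55.85        # molar mass of iron, g/mol
    n = 2            # electrons per atom dissolved (Fe -> Fe2+)
    rho = 7.87       # density of iron, g/cm^3

    current_A = 5.0  # assumed etching current
    hours = 1.0

    charge = current_A * hours * 3600  # coulombs delivered
    mass_g = charge * M / (n * F)      # grams removed at 100% current efficiency
    volume_cm3 = mass_g / rho

    print(f"~{mass_g:.1f} g (~{volume_cm3:.2f} cm^3) of iron dissolved per hour at {current_A:g} A")
    # Roughly 5 g an hour, which is why these builds are mostly waiting and
    # periodically checking progress.
    ```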