Don’t call it “accidental database deletion”. Call it “unannounced live exercise of backup procedure”.
Either that or rapid exodus from the country
Embrace it as you are now officially a developer
At some point this is just a rite of passage
I don’t even work with databases but I still have a fear of possibly doing this someday.
If it happens, it’s a lack of controls, and, at least mostly, not your fault
You just triggered my PTSD
DB admins rawdogging the prod postgres on a Friday evening.
I did something like that once. I wasn’t very good at SQL, but I needed some data, so I logged into the production database and ran my SELECT queries. I didn’t change anything, so everything was fine, or so I thought.
I had created a cross product over tables with millions of entries, and when the query didn’t respond I thought it was odd, but it was time to go home anyway. On the way home they called me and asked what I had done. They had to restart the DB server because once the cache timed out, one application after another started failing.
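The mistake above is easy to reproduce: listing two tables without a join condition pairs every row of one with every row of the other, so the result size is the product of the table sizes. A minimal sketch with SQLite standing in for the production Postgres (table and column names are made up for illustration):

```python
import sqlite3

# Tiny hypothetical tables; at millions of rows each, the same mistake
# yields a result set of millions * millions rows.
con = sqlite3.connect(":memory:")
con.executescript("""
    CREATE TABLE orders(id INTEGER, customer_id INTEGER);
    CREATE TABLE customers(id INTEGER, name TEXT);
""")
con.executemany("INSERT INTO orders VALUES (?, ?)",
                [(i, i % 3) for i in range(1000)])
con.executemany("INSERT INTO customers VALUES (?, ?)",
                [(i, f"c{i}") for i in range(1000)])

# Missing join condition: every order pairs with every customer (cross product).
cross = con.execute("SELECT COUNT(*) FROM orders, customers").fetchone()[0]

# With the join condition: one row per matching pair.
joined = con.execute(
    "SELECT COUNT(*) FROM orders JOIN customers "
    "ON orders.customer_id = customers.id"
).fetchone()[0]

print(cross, joined)  # 1000000 vs 1000
```

Running EXPLAIN first, or adding a LIMIT while exploring, would have surfaced the blowup before it tied up the server.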
Sometimes I dream of getting fired for accidentally doing shit like this. Sweet relief…
Good companies wouldn’t fire someone for this because:

- There should be processes in place to prevent this, or recover from this, anyway. It’s a team/department failure and you would just be the straw that broke the camel’s back.
- They now know you’ve experienced this and will hopefully know to never do it again. Bringing in someone else could just reintroduce the issue.