Can’t find an attribution for this quote, but it was quoted by Harry Moseley in an interview with the Atlassian team:
We have an ocean of data, but a desert of insight
Great slide deck by Uwe Friedrichsen on resilience patterns to use when designing applications.
I love it when engineering teams share their tricks of the trade for other organizations to benefit from. While this might seem counter-intuitive, sharing knowledge makes the entire ecosystem better.
Etsy’s engineering team does a great job of publishing their architecture, methodologies, and code at https://codeascraft.com.
This particular article on how they optimize their caching infrastructure (https://codeascraft.com/2017/11/30/how-etsy-caches/) is pretty enlightening. I always thought the best method to load balance objects (app hits, cache requests, queues, etc.) across hosts was to use modulo operations. In this blog post, Etsy’s team talks about using consistent hashing instead of modulo hashing.
At a high level, consistent hashing allows cache nodes to fail without drastically impacting the overall performance of the application, in addition to making it easy to scale the number of nodes up or down. This method is especially useful when you have a large number of cache nodes.
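To make the contrast concrete, here is a minimal sketch in Python of a consistent hash ring with virtual nodes (a toy illustration, not Etsy’s actual implementation). With modulo hashing (`hash(key) % num_nodes`), removing one node remaps almost every key; with the ring, only the keys that lived on the removed node move.

```python
import bisect
import hashlib

def _hash(value: str) -> int:
    # Stable hash (Python's built-in hash() is randomized per process)
    return int(hashlib.md5(value.encode()).hexdigest(), 16)

class ConsistentHashRing:
    """Maps keys to cache nodes via a hash ring with virtual nodes."""

    def __init__(self, nodes, vnodes=100):
        # Each node gets `vnodes` points on the ring for an even key spread
        self._ring = sorted(
            (_hash(f"{node}#{i}"), node)
            for node in nodes
            for i in range(vnodes)
        )

    def get_node(self, key: str) -> str:
        # Walk clockwise to the first ring point at or after the key's hash
        idx = bisect.bisect(self._ring, (_hash(key),)) % len(self._ring)
        return self._ring[idx][1]
```

Dropping a node only removes that node’s points from the ring; every other node’s points, and therefore its keys, stay exactly where they were. That is the resilience property the article describes.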
More reference links
I was trying to search for some files on my laptop today and wanted to filter the search to files modified in the last few weeks. For example, show me all files that contain the word “American” and were modified in the last 2 weeks. Doing this on a Linux machine would have been a simple filter using find. But this is Microsoft :).
Thanks to some Googling, I ran across something called “Advanced Query Syntax” (AQS) that is a core part of Microsoft’s ecosystem (OS, Office, etc.).
So the same search ended up being
American datemodified:this month
There are a lot of cool ways you can filter your queries using the other keywords in AQS.
Good blog post by the engineering team at Stripe on using Kubernetes to run a distributed cron scheduler
https://stripe.com/blog/operating-kubernetes
Good blog post by Timothy Downs on how queues and data streams work, with a layman’s example, at https://hackernoon.com/introduction-to-redis-streams-133f1c375cd3
Quoting the example here
We have a very long book which we would like many people to read. Some can read during their lunch hour, some read on Monday nights, others take it home for the weekend. The book is so long that at any point in time, we have hundreds of people reading it.
Readers of our book need to keep track of where they are up to in our book, so they keep track of their location by putting a bookmark in the book. Some readers read very slow, leaving their bookmark close to the beginning. Other readers give up halfway, leaving theirs in the middle and never coming back to it.
To make matters even worse, we are adding pages to this book every day. Nobody can actually finish this book.
Eventually our book fills up with bookmarks, until finally one day it is too heavy to carry and nobody can read it any more.
A very clever person then decided that readers should not be allowed to place bookmarks inside the book, and must instead write down the page they are up to on their diary.
This is the design of Apache Kafka, and it is a very resilient design. Readers are not always responsible citizens and will often not clean up after themselves, and the book may be the log of all the important events that happen in our company.
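The bookmark analogy can be sketched in a few lines of Python (a toy illustration, not Kafka’s actual API): the log is append-only and knows nothing about its readers, while each consumer carries its own offset, the “diary.”

```python
class Log:
    """The book: an append-only sequence of events. It stores no bookmarks."""

    def __init__(self):
        self._entries = []

    def append(self, entry):
        self._entries.append(entry)

    def read(self, offset, max_count=10):
        return self._entries[offset:offset + max_count]

class Consumer:
    """A reader: tracks its own position (the diary) outside the log."""

    def __init__(self, log):
        self._log = log
        self.offset = 0

    def poll(self, max_count=10):
        batch = self._log.read(self.offset, max_count)
        self.offset += len(batch)  # advance our private bookmark
        return batch
```

Any number of consumers can read at different speeds, and none of them weighs the log down, because their state lives with them, not in the book.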
When you do your job well, no one notices. When you screw up, the whole internet notices
– Julia Grace, during a great keynote speech at Velocity 2017 on how to build efficient engineering teams.
Command parameters for varnishlog to view the backend server that processed a request. In this particular case, I wanted to see the request URL and backend server for any responses with HTTP code 401 (unauthorized access):
sudo varnishlog -i BackendOpen,BereqURL -q "BerespStatus == 401"
Found this good article by Mark Huot on using curl to test WebSocket servers at http://www.thenerdary.net/post/24889968081/debugging-websockets-with-curl
curl -i -N -H "Connection: Upgrade" -H "Upgrade: websocket" -H "Host: echo.websocket.org" -H "Origin: http://www.websocket.org" http://echo.websocket.org
On a server that is running either the FTP client or the FTP server, you can capture the FTP password using tcpdump. Since FTP sends credentials in cleartext, they show up directly in the capture:
tcpdump -A port ftp