A great graph on all the times SEO was pronounced dead, yet the market value of the SEO industry keeps growing. Graph by Chris Lourenco on linkedin.

A quick collection of tools you can use to serve/publish content and applications from your local dev environment to the Interwebs. Some use cases for these types of tools:
List of tools:
That was a pretty long title for the post :). I love nginx for its flexibility and ease of use. It is like a Swiss Army knife: it can do a lot of things :).
We needed to serve some dynamic content for one of our use cases. If a user visits a site using the URL format http://example.com/23456789/678543, we want to respond with some HTML content that is customized using the 23456789 and 678543 strings.
A picture might help here

Here’s how this was achieved
location ~ "^/(?<param1>[0-9]{8})/(?<param2>[0-9]{6})" {
    root /var/www/html/test/;
    index template.html;
    sub_filter_once off;
    sub_filter '_first_param_' '$param1';
    sub_filter '_second_param_' '$param2';
    rewrite ^.*$ /template.html break;
}
Create a file named template.html with the following content in /var/www/html/test.
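A minimal sketch of what such a template could look like; the only real requirement is that it contain the placeholder strings the sub_filter directives look for (the markup around them is illustrative):

```html
<!DOCTYPE html>
<html>
  <head><title>Example</title></head>
  <body>
    <!-- _first_param_ and _second_param_ are replaced by nginx's sub_filter -->
    <p>First parameter: _first_param_</p>
    <p>Second parameter: _second_param_</p>
  </body>
</html>
```

With this in place, a request to http://example.com/23456789/678543 returns the page with 23456789 and 678543 substituted in.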
Breaking down the config one line at a time
location ~ "^/(?<param1>[0-9]{8})/(?<param2>[0-9]{6})" : The regex matches the first series of 8 digits (each in the range 0-9) after the / and captures it as the value of variable $param1. The second series of 6 digits (each in the range 0-9) is captured as the value of variable $param2.
root /var/www/html/test/; : Specifies the document root for this location block.
index template.html; : Specifies the default index page for this location block.
sub_filter_once off; : Directs the sub_filter module to not stop after the first match when replacing response content. By default it processes only the first match and stops.
sub_filter '_first_param_' '$param1'; : Directs the sub_filter module to replace any text matching _first_param_ in the response HTML with the value of variable $param1.
sub_filter '_second_param_' '$param2'; : Directs the sub_filter module to replace any text matching _second_param_ in the response HTML with the value of variable $param2.
rewrite ^.*$ /template.html break; : Directs nginx to serve template.html regardless of the URI requested.
Big thanks to Igor for help with the configs!!
ADP is a $70B+ (by market cap as of August 2019) company and yet cannot get a simple redirect correct. If someone who is asked to use its employee performance management system types in tms.adp.com (like most people would do), they get this nice friendly error

If by some magical and mystical reason, they type in https://tms.adp.com, they get this login page

I find it mind boggling that such a mature company cannot figure out a simple HTTP-to-HTTPS redirect.
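For comparison, sending plain-HTTP traffic to HTTPS is a few lines in most web servers. A hedged nginx sketch (ADP's actual server stack is unknown, so this is purely illustrative):

```nginx
# Listen on port 80 and permanently redirect every request
# to the HTTPS equivalent of the same host and URI.
server {
    listen 80;
    server_name tms.adp.com;
    return 301 https://$host$request_uri;
}
```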
End Rant and sorry to all my friends that work at ADP 🙂
I love when engineering teams share their tricks of the trade for other organizations to benefit. While this might seem counter-intuitive, sharing knowledge makes the entire ecosystem better.
Etsy's engineering team does a great job of publishing their architecture, methodologies and code at https://codeascraft.com.
This particular article on how they optimize their caching infrastructure (https://codeascraft.com/2017/11/30/how-etsy-caches/) is pretty enlightening. I always thought the best method to load balance objects (app hits, cache requests, queues etc) to hosts was to use mod operations. In this blog post, Etsy's team talks about using consistent hashing instead of modulo hashing.
At a high level, it allows cache nodes to fail without drastically impacting the overall performance of the application, in addition to making it easy to scale the number of nodes. This method is useful when you have a large number of cache nodes.
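To make the contrast concrete, here is a minimal Python sketch of a consistent hash ring (illustrative only, not Etsy's implementation; the node names and virtual-node count are made up). Removing one node only remaps the keys that lived on it, while modulo hashing remaps most keys:

```python
import bisect
import hashlib

def stable_hash(key):
    # A stable hash (unlike Python's built-in hash(), which is salted per process).
    return int(hashlib.md5(key.encode()).hexdigest(), 16)

class ConsistentHashRing:
    def __init__(self, nodes, vnodes=100):
        # Place vnodes points per node on the ring to smooth out the distribution.
        self.ring = sorted(
            (stable_hash(f"{node}#{i}"), node)
            for node in nodes
            for i in range(vnodes)
        )
        self.points = [point for point, _ in self.ring]

    def get(self, key):
        # A key belongs to the first node point clockwise from its hash.
        idx = bisect.bisect(self.points, stable_hash(key)) % len(self.points)
        return self.ring[idx][1]

keys = [f"user:{i}" for i in range(1000)]

# Consistent hashing: drop cache-c and only its keys (~1/3) move,
# because the surviving nodes' ring points are unchanged.
before = ConsistentHashRing(["cache-a", "cache-b", "cache-c"])
after = ConsistentHashRing(["cache-a", "cache-b"])
moved_ring = sum(before.get(k) != after.get(k) for k in keys)

# Modulo hashing: going from 3 nodes to 2 remaps roughly 2/3 of all keys.
moved_mod = sum(stable_hash(k) % 3 != stable_hash(k) % 2 for k in keys)

print(moved_ring, moved_mod)  # moved_ring is far smaller than moved_mod
```

The ring wraps around (a key hashing past the last point maps back to the first), and the virtual nodes keep the load roughly even across the real nodes.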
More reference links
Good blog post by the engineering team at Stripe on using Kubernetes to run a distributed cron scheduler
https://stripe.com/blog/operating-kubernetes
Over the last week, I moved this blog from a LAMP (Linux, Apache, MySQL, PHP) stack to a LEMP (Linux, Nginx, MySQL, PHP) stack. I have a blog post in the works with all the gory details, but wanted to quickly document a quirk in the WordPress + Nginx combination that broke permalinks on this site.
Permalinks are user-friendly permanent static URLs for a blog post. So for example, this particular blog post's URL is
https://kudithipudi.org/2017/02/24/how-to-configure…press-permalinks/
instead of
https://kudithipudi.org/?p=1762
This works by default in Apache because WordPress puts in the required rewrite rules.
To get it to work in Nginx, you have to add the following config in the Nginx site configuration
Under the / location context, add the following
try_files $uri $uri/ /index.php?$args;
This is essentially telling Nginx to try to display the URI as is, and if it fails that, pass the URI as an argument to index.php.
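For context, here is a sketch of where the directive sits in a typical WordPress server block (the fastcgi socket path and PHP version are assumptions and vary by distribution):

```nginx
server {
    listen 80;
    server_name example.org;
    root /var/www/html;
    index index.php;

    location / {
        # Serve the file or directory if it exists; otherwise hand the
        # request to WordPress's front controller with the original args.
        try_files $uri $uri/ /index.php?$args;
    }

    location ~ \.php$ {
        include fastcgi_params;
        fastcgi_param SCRIPT_FILENAME $document_root$fastcgi_script_name;
        fastcgi_pass unix:/run/php/php-fpm.sock;  # socket path is an assumption
    }
}
```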
Found this good article by Mark Huot on using curl to check websocket servers at http://www.thenerdary.net/post/24889968081/debugging-websockets-with-curl
curl -i -N -H "Connection: Upgrade" -H "Upgrade: websocket" -H "Host: echo.websocket.org" -H "Origin: http://www.websocket.org" http://echo.websocket.org
Literally copying this post/note from a blog post by Nate Good http://nategood.com/quickly-add-and-edit-cookies-in-chrome
If you want to insert a cookie into a website request in Google Chrome, you can do it by launching developer tools (F12 in Windows) and typing the following in the console
javascript:document.cookie="myCookieName=myCookieValue"
varnishlog, one of the tools provided with varnish cache, uses VSL Query Expressions (https://www.varnish-cache.org/docs/trunk/reference/vsl-query.html) to provide some powerful insights into the requests and responses.
Here is how you can use varnishlog to show all client requests that end up with a 404 response.
sudo varnishlog -g request -i ReqURL -q "BerespStatus != 200"
Technically, this particular query shows all client requests with a response other than 200.
Breaking down the commands
-g request : groups the log entries by request
-i ReqURL : forces varnishlog to only display the request URL (the ReqURL tag)
-q "BerespStatus != 200" : query filter to only match non-200 responses. Note that the query has to be enclosed in double quotes.
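To match only 404s, as described at the top of this post, the same VSL query syntax works with an equality test (untested variant of the command above):

```shell
sudo varnishlog -g request -i ReqURL -q "BerespStatus == 404"
```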