Instead of posting individual posts every day to track my progress (or lack thereof) in the 60-day challenge, I created this page (https://kudithipudi.org/2012-60-day-resolution/) for the daily entries.
Wish me luck :).
I have a confession to make.. I like Big Macs and Krispy Kreme Donuts :). And they have contributed heavily to the increase in my .. hmm.. how do I say this.. mid-section :). Plus, it doesn’t help that there is a Krispy Kreme factory and a McDonald’s right on my way to work. Add to that the fact that I haven’t been running for the last year or so, and I am proud to say I have joined the >65% of Americans who are overweight or obese.
On my way to work yesterday, I was thinking about what shape (physically) I would be in when Virat grows up. I am sure he doesn’t want a dad who can’t play some hoops with him :).
So here’s my two-month resolution. I am starting with just a couple of months because there is a good chance it might become a habit, and I can go from there :).
For every pledge I break, I am going to leave work at 5:00 PM for a week. Believe me when I say that is a tough punishment :). You see.. I love what I do :).
Viva La Resistance!!!
Inspired by this blog post by Vaidas Jablonskis. This tip has been tested on Red Hat and CentOS distributions.
If you ever wanted to log all the commands issued by users on a server, you can edit the default profile configuration to enable this.
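A minimal sketch of the idea, assuming bash and a syslog daemon that accepts the local6 facility (both assumptions on my part): append a `PROMPT_COMMAND` to `/etc/profile`, so that before every new prompt the last command from the shell history is shipped to syslog via `logger`.

```shell
# Appended to /etc/profile -- logs each interactive command to syslog.
# PROMPT_COMMAND runs just before bash draws a new prompt; "history 1"
# returns the last command, and sed strips the leading history number.
export PROMPT_COMMAND='logger -p local6.debug -t bash "$(whoami) [$$]: $(history 1 | sed "s/^ *[0-9]* *//")"'
```

You can then point the local6 facility at a file of your choice in the syslog daemon's configuration.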
JBoss uses the log4j framework for providing logging services. log4j is a very flexible framework and can do a lot of things. One of the features provided by log4j is the ability to send log messages to multiple destinations. Here is a quick how-to on configuring JBoss to send log messages over the syslog protocol to a syslog server. This is pretty useful when you are trying to consolidate logs from multiple sources into a central location.
First, some background on how log4j is configured in JBoss.
The log4j configuration in JBoss is managed by the file jboss-log4j.xml located at $JBOSS_HOME/server/$JBOSS_PROFILE/conf.
There are three parts to this configuration file: appenders (where log messages are sent), categories (which loggers are routed to which appenders), and the root logger (the default destination for everything else).
So pictorially, it would look like this:
Getting back to the reason for this post, here is how you would enable the syslog appender and then configure a category to use this appender. For this example, we will use a category named org.kudithipudi.
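A sketch of the two fragments in jboss-log4j.xml, following the stock file's conventions. The syslog host 192.168.1.10 and the LOCAL7 facility are placeholder assumptions; substitute your own server and facility.

```xml
<!-- Appender section: ship log events over UDP 514 to a syslog server.
     192.168.1.10 is a placeholder for your syslog server's address. -->
<appender name="SYSLOG" class="org.apache.log4j.net.SyslogAppender">
   <errorHandler class="org.jboss.logging.util.OnlyOnceErrorHandler"/>
   <param name="Facility" value="LOCAL7"/>
   <param name="FacilityPrinting" value="true"/>
   <param name="SyslogHost" value="192.168.1.10"/>
   <layout class="org.apache.log4j.PatternLayout">
      <param name="ConversionPattern" value="[%d{ABSOLUTE},%c{1}] %m%n"/>
   </layout>
</appender>

<!-- Category section: route the org.kudithipudi logger to the appender -->
<category name="org.kudithipudi">
   <priority value="INFO"/>
   <appender-ref ref="SYSLOG"/>
</category>
```

Restart JBoss (or redeploy the logging configuration) for the change to take effect.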
A couple of notes..
When troubleshooting performance issues, never take anything for granted. Yes, even if a component was not touched or restarted, chances are that something touching it has been, and that might have affected it.
This goes especially for the network (IP and fiber), which doesn’t change as often as the rest of the environment.
A very timely post on Hacker News by Ewan Leith about configuring a low-end server to handle ~11 million hits per month gave me some more ideas on optimizing the performance of this website. Ewan used a combination of nginx and varnish to get the server to respond to such traffic.
From my earlier post, you might recall that I planned on checking out nginx as the web server, but then ended up using Apache. My earlier stack looked like this. Based on the recommendations from Ewan’s article, I decided to add Varnish to the picture. So here is how the stack looks currently:
And boy, did the performance improve or what. Here are some before and after performance charts based on a test run from blitz.io. The test lasted for 60 seconds and was for 250 simultaneous connections.
BEFORE
AFTER
What a difference!! The server in fact stopped responding after the first test and had to be hard rebooted. So how did I achieve this? By mostly copying the ideas from Ewan :). The final configuration for serving the web pages looks like this on the server end:
Varnish (listens on TCP 80) –> Apache (listens on TCP 8080)
NOTE : All the configuration guides (as with the previous entries of the posts in this series) are specific to Ubuntu.
and you are ready to rock and roll.
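For reference, a sketch of the minimal configuration changes on Ubuntu, assuming the stock package file locations; the 256m cache size is an arbitrary example, not a recommendation:

```
# --- /etc/default/varnish : make varnishd listen on port 80 ---
DAEMON_OPTS="-a :80 \
             -T localhost:6082 \
             -f /etc/varnish/default.vcl \
             -S /etc/varnish/secret \
             -s malloc,256m"

# --- /etc/varnish/default.vcl : point Varnish at Apache as the backend ---
backend default {
    .host = "127.0.0.1";
    .port = "8080";
}

# --- /etc/apache2/ports.conf : move Apache off port 80 ---
Listen 8080
```

Restart both services (`sudo service apache2 restart && sudo service varnish restart`) so Varnish takes over port 80.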
There are some issues with this setup in terms of logging. Unlike a typical web server log, where every request is recorded, I noticed that not all requests were being logged. I guess that is because Varnish serves much of the content straight from cache, so those requests never reach Apache. I still have to figure out how to get complete logs. But that is for another post :).
In early March 2012, I decided to write at least one blog post per day for the whole month. How did I do? 29 posts in 31 days. I should acknowledge that I cheated a bit :), by writing two posts in a day but scheduling them to be published on different days.
My learning from the month long exercise?
Let’s see how long I can keep it up.
And no.. this is not an April fools joke :).
I had to convert a scanned PDF file into an editable document recently. You can do this using OCR, and there is a ton of software out there that does it. There are even web-based services for it. But each of them had limitations: either I had to buy the software, or there was a limit on the number of pages that could be converted. I didn’t want to buy a license, since this is not something I would be doing regularly, and the document I had to convert was 61 pages, so none of the online services would take it. I remembered reading that Google Docs added this OCR capability a while ago, and since I have a Google Apps account, I decided to give it a try.
Google also has a limit of 2 pages per OCR conversion. So after some brainstorming, I came up with this quick hack for using Google Docs to convert large PDF files into editable content.
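The splitting step can be scripted. A sketch using the pdftk tool (an assumption on my part; any PDF splitter would do), with a hypothetical input.pdf: it cuts the file into 2-page chunks to match the Google Docs OCR limit, ready for upload.

```shell
#!/bin/bash
# Split a large PDF into 2-page chunks for Google Docs OCR.
# Assumes pdftk is installed; input.pdf is a placeholder filename.
INPUT=input.pdf
# pdftk's dump_data output contains a "NumberOfPages: N" line.
PAGES=$(pdftk "$INPUT" dump_data | awk '/NumberOfPages/ {print $2}')
for start in $(seq 1 2 "$PAGES"); do
    end=$((start + 1))
    [ "$end" -gt "$PAGES" ] && end=$PAGES   # last chunk may be a single page
    pdftk "$INPUT" cat "$start"-"$end" output "part_${start}.pdf"
done
```

For a 61-page document this produces 31 chunks (part_1.pdf through part_61.pdf), the last one being a single page.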
I think someone with more programming chops than me can improve this by using the Google API to do the copy/paste from the smaller docs into the final document :).
If you run into a situation where you need to search through a bunch of files and print the names of the files that don’t contain a particular string, here is how you do it in Linux:
[code]find . -type f -name "PATTERN_FOR_FILE_NAMES" | xargs grep -L "STRING_YOU_ARE_SEARCHING_FOR"[/code]
The -L option for grep does this (according to the manual)
Suppress normal output; instead print the name of each input file from which no output would normally have been printed. The scanning will stop on the first match.
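A quick illustration with made-up files (the .conf names and the "ServerName" string are just examples): only the file that does not contain the string is printed.

```shell
# Set up two sample files: a.conf contains the string, b.conf does not.
mkdir -p /tmp/grepdemo && cd /tmp/grepdemo
printf 'ServerName example.com\n' > a.conf
printf 'Listen 8080\n' > b.conf

# Print the names of files that do NOT contain "ServerName".
find . -name "*.conf" -type f | xargs grep -L "ServerName"
# prints ./b.conf
```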
Quick one-liner for capturing traffic destined to and arriving from a host (IP address) using tcpdump and writing it to a file for analyzing later on (the saved file can be read back with tcpdump -r):
[code]tcpdump -s 0 -w destination.pcap host x.x.x.x[/code]