I really liked this slide from Carvana’s 2024 Q3 earnings call presentation on their potential addressable market. It communicates the idea in a simple, impactful way.
30 day challenge : create software with AI
I like to do 30 day challenges to explore new areas, or to form habits. Some of my previous ones were
I am starting a new challenge today: creating software by leveraging AI. The recent boom in AI, and GenAI specifically, has made it very easy and quick to bring your ideas to fruition. It is time to start coding and developing software for the ideas that have been swirling in my head for some time.
I will be publishing them at https://kudithipudi.org/lab . I will expand on some of the ideas and write up the experience of bringing them to life.
Inspired by https://tools.simonwillison.net/.
HOW TO : Run Anthropic Computer Use Tool on a Windows Machine
Anthropic released their new Claude 3.5 Sonnet model yesterday with a new capability to control computers. The Computer Use capability allows Claude to directly interact with computer interfaces, enabling tasks like web browsing, data analysis, and file manipulation, all through natural language instructions. It is similar to tool use, but now you don’t have to define specific tools. I think this opens up a whole new window of opportunities for leveraging LLMs.
Anthropic shared a quick start guide to run the model in a container, but the instructions are for Mac/Linux-based workstations. I had to make some tweaks to run them on a Windows workstation.
Documenting them here for anyone who might be trying to do the same.
- Install Docker Desktop
- Open a command prompt
- Run the following command to set your Anthropic API key as an environment variable
set ANTHROPIC_API_KEY=YOUR-ANTHROPIC-KEY
- Run the following command to start the docker container
docker run -e ANTHROPIC_API_KEY=%ANTHROPIC_API_KEY% -v %USERPROFILE%\.anthropic:/home/computeruse/.anthropic -p 5900:5900 -p 8501:8501 -p 6080:6080 -p 8080:8080 -it ghcr.io/anthropics/anthropic-quickstarts:computer-use-demo-latest
- Launch the Streamlit app by opening this URL in your browser: http://localhost:8080/
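The steps above can also be wrapped in a small helper script. Here is a sketch in Python (my own addition, not part of Anthropic’s quick start) that assembles the same docker run invocation, reading the key from the environment so it works from a Windows command prompt (USERPROFILE) or a Unix shell (HOME):

```python
import os

# Read the key set earlier with `set ANTHROPIC_API_KEY=...`.
api_key = os.environ.get("ANTHROPIC_API_KEY", "")
home = os.environ.get("USERPROFILE") or os.environ.get("HOME", "")

# Assemble the same `docker run` command as above; ports and image name
# are copied from Anthropic's quick start command.
command = [
    "docker", "run",
    "-e", f"ANTHROPIC_API_KEY={api_key}",
    "-v", f"{home}/.anthropic:/home/computeruse/.anthropic",
    "-p", "5900:5900", "-p", "8501:8501",
    "-p", "6080:6080", "-p", "8080:8080",
    "-it", "ghcr.io/anthropics/anthropic-quickstarts:computer-use-demo-latest",
]
print(" ".join(command))  # pass `command` to subprocess.run to actually launch it
```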
HOW TO : Troubleshoot Zscaler client
I recently encountered some connectivity issues while working from home and trying to access some corporate resources. Notes for myself on some tips our infosec team shared for troubleshooting the Zscaler client, since all traffic to the interweb gets routed through it.
- http://speedtest.zscaler.com/perf
- Gives you an overview of which Zscaler PoP you are connecting to and your Internet access speed via that PoP.
- http://127.0.0.1:9000/?ztest?q=@YOUR-CORPORATE-DOMAIN (ex: google.com)
- This provides a detailed report, including:
- DNS Reachability Test: Confirms if DNS is resolving correctly.
- UDP Connectivity Test: Checks if UDP packets can pass through.
- TraceRoute to Zscaler: Shows the path your data takes to reach Zscaler.
- Throttling Test: Identifies any speed drops.
- Download/Upload Bandwidth: Measures the speed at which data transfers.
- https://ip.zscaler.com
- A quick utility to check where and how your traffic is routed through the Zscaler network. Very similar to the perf test data, but doesn’t let you run a performance test.
On AI Agentic Workflows
Amazing conversation with Bret Taylor on agentic workflows leveraging AI in the enterprise. The whole conversation is worth listening to multiple times, but this specific segment, where Bret speaks about the difference between traditional software engineering and AI-driven solutions, was thought provoking on how much change management organizations have to go through to adopt these new solutions.
Now if you have parts of your system that are built on large language models, those parts are really different than most of the software that we’ve built on in the past. Number one is they’re relatively slow compared — to generate a page view on a website takes nanoseconds at this point, might be slightly exaggerating, down to milliseconds, even with the fastest models, it’s quite slow in the way tokens are emitted.
Number two is it can be relatively expensive. And again, it really varies based on the number of parameters in the model. But again, the marginal cost of that page view is almost zero at this point. You don’t think about it. Your cost as a software platform is almost exclusively in your head count. With AI, you can see the margin pressure that a lot of companies face, particularly of their training models or even doing inference with high-parameter-count models.
Number three is they’re nondeterministic fundamentally, and you can tune certain models to more reliably have the same output for the same input. But by and large, it’s hard to reproduce behaviors on these systems. What gives them creativity also leads to non-determinism.
And so this combination of it, we’ve gone from cheap, deterministic, reliable systems to relatively slow, relatively expensive but very creative systems. And I think it violates a lot of the conventions that software engineers think about — have grown to think about when producing software, and it becomes almost a statistical problem rather than just a methodological problem.
HOWTO : Bulk deletes in vi
Use the “dG” command if you want to delete all lines in a file starting from the line under the cursor in vi.
Additional commands to delete lines:
- dd deletes the whole line under the cursor.
- xdd deletes multiple (x) lines, starting at the cursor. For example, 10dd deletes 10 lines.
- d$ deletes to the end of the line, starting at the cursor.
HOW TO : Create free clipart
You can leverage the explosion of generative AI art engines to create your own clipart.
- Create an image in MidJourney (you can get free credits to create up to 200 images)
- You can add “clipart” to any image description to get good results
- Use remove.bg to remove any background from the image. (500×500 pixel PNG images are free)
- enjoy 🙂
Here’s a clipart that I created using the prompt “moscow mule drink illustration, clipart”
HOWTO : SQL Joins
A simple and useful visual of joins in SQL by Taylor Brownlow from this post https://towardsdatascience.com/take-your-sql-from-good-to-great-part-3-687d797d1ede
HOWTO : Query json data in SQLite
A self note for querying JSON data in SQLite. BTW, I think SQLite is an underutilized and underappreciated Swiss Army knife for data storage and manipulation. And thanks to Richard Hipp, it is free.
If you have a column storing JSON in your SQLite database, the quickest way to search the data is json_extract. The full set of JSON functions is documented at https://www.sqlite.org/json1.html
If you have a column named family_details in a table family with the following JSON in it as an example
{
"father": {
"name": "dad",
"birthday": "1/1/2000",
"pet_name": "daddy"
},
"mother": {
"name": "mom",
"birthday": "1/1/2001",
"pet_name": "mommy"
},
"sons": [
{
"name": "son_one",
"birthday": "1/2/2020",
"pet_name": "sonny_one"
},
{
"name": "son_two",
"birthday": "1/2/2021",
"pet_name": "sonny_two"
}
],
"daughters": [
{
"name": "princess_one",
"birthday": "1/2/2020",
"pet_name": "princy_one"
},
{
"name": "princess_two",
"birthday": "1/2/2021",
"pet_name": "princy_two"
}
]
}
and you want to print the name of the father, you can use
select json_extract(family_details, '$.father.name') as father_name
from family
json_extract takes the name of the column and a JSON path as parameters. In this case the path is $.father.name, where $ denotes the root, father the parent node, and name the node under father.
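To try this end to end, here is a quick sketch using Python’s built-in sqlite3 module, with the example JSON and the table/column names from above (SQLite stores the JSON as TEXT):

```python
import sqlite3

# In-memory database with the example table from the post.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE family (family_details TEXT)")

# A condensed copy of the example JSON above.
family_json = """
{
  "father": {"name": "dad", "birthday": "1/1/2000", "pet_name": "daddy"},
  "mother": {"name": "mom", "birthday": "1/1/2001", "pet_name": "mommy"},
  "sons": [
    {"name": "son_one", "birthday": "1/2/2020", "pet_name": "sonny_one"},
    {"name": "son_two", "birthday": "1/2/2021", "pet_name": "sonny_two"}
  ],
  "daughters": [
    {"name": "princess_one", "birthday": "1/2/2020", "pet_name": "princy_one"},
    {"name": "princess_two", "birthday": "1/2/2021", "pet_name": "princy_two"}
  ]
}
"""
conn.execute("INSERT INTO family VALUES (?)", (family_json,))

# json_extract walks the JSON path: $ is the root, then father, then name.
father = conn.execute(
    "SELECT json_extract(family_details, '$.father.name') AS father_name FROM family"
).fetchone()[0]
print(father)  # dad

# Array elements are addressed by index, e.g. the first son's name.
first_son = conn.execute(
    "SELECT json_extract(family_details, '$.sons[0].name') FROM family"
).fetchone()[0]
print(first_son)  # son_one
```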
HOW TO : Configure nginx to use URI for modifying response content
That was a pretty long title for the post :). I love nginx for its flexibility and ease of use. It is like a Swiss Army knife: it can do a lot of things :).
We needed to serve some dynamic content for one of our use cases. If a user visits a site using the URL format http://example.com/23456789/678543, we want to respond with some HTML content that is customized using the 23456789 and 678543 strings.
A picture might help here
Here’s how this was achieved:
- Define a location section in the nginx config to respond to the URL path specified and direct it to substitute content
location ~ "^/(?<param1>[0-9]{8})/(?<param2>[0-9]{6})" {
root /var/www/html/test/;
index template.html;
sub_filter_once off;
sub_filter '_first_param_' '$param1';
sub_filter '_second_param_' '$param2';
rewrite ^.*$ /template.html break;
}
- Create a file named template.html in /var/www/html/test containing the _first_param_ and _second_param_ placeholder strings
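A minimal template.html could look like this (a hypothetical sketch; the only requirement implied by the config is that the file contains the _first_param_ and _second_param_ placeholder strings):

```html
<!DOCTYPE html>
<html>
  <body>
    <!-- sub_filter replaces these placeholders with the URL captures -->
    <p>First parameter: _first_param_</p>
    <p>Second parameter: _second_param_</p>
  </body>
</html>
```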
Breaking down the config one line at a time
location ~ "^/(?<param1>[0-9]{8})/(?<param2>[0-9]{6})"
: The regex matches the request path and captures two values. The first capture is a series of 8 digits (each in the range 0-9), stored in the variable $param1; the second capture is a series of 6 digits, stored in the variable $param2.
root /var/www/html/test/;
: Specifies the document root for this location.
index template.html;
: Specifies the default page for this location.
sub_filter_once off;
: Tells the sub_filter module not to stop after the first match when replacing response content; by default it processes only the first match and stops.
sub_filter '_first_param_' '$param1';
: Directs the sub_filter module to replace any text matching _first_param_ in the response HTML with the value of variable $param1.
sub_filter '_second_param_' '$param2';
: Directs the sub_filter module to replace any text matching _second_param_ in the response HTML with the value of variable $param2.
rewrite ^.*$ /template.html break;
: Tells nginx to serve template.html regardless of the URI specified.
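As a quick sanity check of the capture groups outside nginx, here is a sketch in Python with an equivalent regex (Python requires the (?P<name>…) syntax where nginx’s PCRE also accepts (?<name>…)):

```python
import re

# Equivalent of the nginx location regex: 8 digits, a slash, then 6 digits.
pattern = re.compile(r"^/(?P<param1>[0-9]{8})/(?P<param2>[0-9]{6})")

match = pattern.match("/23456789/678543")
print(match.group("param1"))  # 23456789
print(match.group("param2"))  # 678543
```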
Big thanks to Igor for help with the configs!!