Chris Padilla/Blog


My passion project! Posts spanning music, art, software, books, and more. Equal parts journal, sketchbook, mixtape, dev diary, and commonplace book.


    Deploying TypeScript to AWS Lambda

    In the early days of TypeScript, one of the larger barriers to entry was the setup required. Setting your configuration and checking if external packages ship with types took upfront work. On top of it all, neither Node nor the browser reads TypeScript directly, so transpiling to JavaScript is required for those environments.

    Much has improved since. Libraries ship with types and spinning up a project has been streamlined.

    Below I'll share some of the tooling that's helped simplify TypeScript setup.

    The Project

    I'll be working on setting up a TypeScript project that will deploy to AWS Lambda. I'll skip the details that are specific to Lambda setup and focus on TypeScript itself.
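
    To have something concrete to compile, here's a minimal sketch of the kind of handler I'm working with (the event types come from the @types/aws-lambda package, and your actual handler will certainly look different):

    // src/index.ts
    import type { APIGatewayProxyEvent, APIGatewayProxyResult } from 'aws-lambda';

    export const handler = async (
        event: APIGatewayProxyEvent
    ): Promise<APIGatewayProxyResult> => {
        const name = event.queryStringParameters?.name ?? 'world';

        return {
            statusCode: 200,
            body: JSON.stringify({ message: `Hello, ${name}!` }),
        };
    };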

    For this to work, there are a few things we'll want to make happen:

    1. Setup Type Checking
    2. Setup a Build Process
    3. Optionally: Select a TypeScript Runtime

    Type Checking

    The biggest benefit of TypeScript comes from... well, the static type checking! An editor such as VS Code can surface type errors for you while you develop. The intended safeguard, though, comes from type checking at compile time.

    TypeScript comes with this out of the box. Here's how you can set it up:

    First, we'll install TypeScript globally through npm:

    npm install -g typescript

    With that comes tsc, the TypeScript Compiler.

    If you haven't already, you'll want to initialize your project with a tsconfig.json file. This command gets you started:

    tsc --init

    Here's a starting place for your ts config:

    {
      "compilerOptions": {
        "target": "es2020",
        "module": "es2020",
        "strict": true,
        "skipLibCheck": true,
      },
      "ts-node": {
        "compilerOptions": {
          "module": "commonjs"
        }
      },
      "exclude": ["node_modules", "**/*.test.ts"]
    }

    Lastly, to compile, it's as simple as this command:

    tsc index.ts

    This will spit out a corresponding JavaScript file in your project with the types stripped out.

    Worth noting: You can also check for types without compiling with the --noEmit flag.

    tsc index.ts --noEmit

    Testing Locally

    You may notice above the ts-node option in my config. ts-node is an engine for executing TS files using the node runtime — without having to transpile your code first.

    What we would have to do without ts-node is generate our JS files as we did above, such as with tsc index.ts. An index.js file would then be generated. From there, we would run node index.js.

    Instead, with ts-node, we would simply call ts-node index.ts.

    ts-node comes with many more features, but a single-command way of running TS files from the CLI is the quickest benefit.
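
    For example, a quick local smoke test (a hypothetical scripts/invoke.ts, assuming the handler sketch from earlier) could look like this:

    // scripts/invoke.ts, run with: ts-node scripts/invoke.ts
    import { handler } from '../src/index';

    handler({
        queryStringParameters: { name: 'Chris' },
    } as any).then((result) => console.log(result));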

    Bundling with ESBuild

    Typically, we reach for bundling solutions with client-side JavaScript and TypeScript to minimize our file sizes, speeding up site load times. While you wouldn't normally need to bundle server-side code, the current AWS Lambda deployment package limit is 250 MB. The node_modules directory would easily eat that up without a bundling strategy!

    The library of choice today is ESBuild, which handles TypeScript, JSX, ESM & CommonJS modules, and more.

    You might ask: If you're going to bundle your code, why did we bother looking at compiling with tsc?

    There are several tools that will run and build TypeScript without actually validating your types, and ESBuild is one of them! When developing your build pipeline, it's likely that you'll need a separate step to validate the types with tsc.

    Here is what the build script looks like using ESBuild:

    esbuild ./src/index.ts --bundle --sourcemap --platform=neutral --target=es2020 --outfile=dist/index.js

    A couple of options to explain:

    • sourcemap: This generates a .js.map file that maps the bundled output back to your original source. This makes sane debugging possible even after bundling and minifying.
    • platform=neutral: Sets default output to esm, using the export syntax.
    • target=es2020: Targets a specific JS spec, one that also includes esm modules.
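
    To tie the type check and the build together, the package.json scripts could look something like this (the script names are just my own convention):

    {
      "scripts": {
        "typecheck": "tsc --noEmit",
        "build": "esbuild ./src/index.ts --bundle --sourcemap --platform=neutral --target=es2020 --outfile=dist/index.js"
      }
    }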

    Picking a Runtime

    If you're including ESM modules in your generated JS files, be sure you're using a runtime that supports them. For example, Node 13 can handle them out of the box, while earlier versions require an experimental flag.

    When deploying to Lambda, Node is a first-class citizen when it comes to support. While not quite as blazingly fast as Rust, a Lambda function running Node will still be highly performant.

    If you're interested in delightful DX and native TypeScript support, however, you may reach for Deno or Bun.

    I'll baton pass this portion of the article to two relevant docs: the AWS Lambda Developer Guide on Building with TypeScript and the Bun Lambda Layer package. Whichever you choose, both are great starting places for deploying your runtime of choice.


    Peg O My Heart

    Listen on Youtube

    Another swing at chord melody!


    City Sunset

    🌆

    Thinking back to a visit to Chicago...


    The Retrieval-Augmented Generation Pattern for AI Development

    Yes, ladies and gentlemen, a post about developing with AI!

    If your team is looking to incorporate an LLM into your services, the first challenge to overcome is how to do so in a cost-effective way.

    Chances are, your business is already focused on a specific product domain, with resources targeted towards building that solution. That alone points you towards finding an off-the-shelf solution to integrate with via an API.

    With your flavor of LLM picked, the next set of challenges centers around getting it to respond to questions in a way that meaningfully provides answers from your business data. LLMs need to be informed on how to respond to requests, what data to utilize when considering their answers, and even what to do if they're tempted to guess.

    The way forward is through prompt engineering, with the help of Retrieval-Augmented Generation.

    Retrieval-Augmented Generation

    The simplified procedure for RAG goes as follows:

    1. Request is made to your app with the message "How many Tex-Mex Restaurants are in Dallas?"
    2. Your application gathers context. For example, we may make a query to our DB for a summary of all restaurants in the area.
    3. We'll provide a summary of the context and instructions to the LLM with a prompt.
    4. We send along the response to the user.

    That's an overly simplified walkthrough, but it should already get you thinking about the details involved in those steps depending on your use case.
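
    Here's a rough sketch of those steps in Python. It's pseudocode more than a real integration: query_restaurant_summary and call_llm stand in for your own data layer and whichever LLM API you've picked.

    def answer_question(question: str) -> str:
        # Steps 1 & 2: a request comes in and we gather context from our own data
        context = query_restaurant_summary(city="Dallas")

        # Step 3: hand the LLM the context along with instructions on how to use it
        prompt = (
            "Answer the question using only the context below. "
            "If the answer isn't in the context, say you don't know.\n\n"
            f"Context: {context}\n\nQuestion: {question}"
        )

        # Step 4: send the response back to the user
        return call_llm(prompt)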

    Another thing to keep in mind is that requests to an API are not inherently stateful. The chat window of an AI app will remember our previous messages, but my API request to that third party does not automatically. We have to store and retrieve that context ourselves.
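
    A minimal sketch of what that bookkeeping can look like (in practice the history would live in a database or cache keyed by conversation, and call_llm is again a stand-in):

    conversation_history: list[dict] = []

    def chat(user_message: str) -> str:
        # Re-send the prior turns so the stateless API has the full context
        conversation_history.append({"role": "user", "content": user_message})
        reply = call_llm(conversation_history)
        conversation_history.append({"role": "assistant", "content": reply})
        return reply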

    AI Agents

    It's worth noting that step 2 may itself require an LLM to parse the question and then interact with an API to gather data. There's still a fair amount of complexity to work through in developing these solutions. This is where you may lean on an AI agent: an LLM that parses a request and determines whether a tool is required, such as pinging your internal APIs.
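
    As a very rough sketch of that decision step (not a real framework API, just the shape of the idea, with call_restaurant_api standing in for one of your internal services):

    def agent_answer(question: str) -> str:
        # Ask the LLM whether a tool is needed to answer this question
        plan = call_llm(f"Decide which tool, if any, is needed to answer: {question}")

        if "restaurant_api" in plan:
            # The agent chose a tool: ping our internal API, then answer with the result
            data = call_restaurant_api(question)
            return call_llm(f"Answer using this data: {data}\n\nQuestion: {question}")

        # No tool needed
        return call_llm(question)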

    Prompt engineering is emerging as a role and craft all its own, and there are many nuances to doing it well.

    LangChain

    The workflow is already so common that there's a framework at the ready to spin up and take care of the heavy lifting for you. LangChain (stylized as 🦜🔗) is just that tool.

    For a hands-on experience building a RAG application on rails, their docs on building a chatbot are a good starting place.

    For a more complex agentic tool, LangGraph opens up the hood on LangChain for more control and plays nicely with LangChain when needed.


    Campfire Folk Intro

    Listen on Youtube

    Gather round, and listen to this tale...

    Just a bit of noodling between practicing longer pieces.


    Calm Sky

    🦋

    It's grey out this time of year. But behind the clouds, there's always a blue sky. 🌤️


    Extending Derived Class Methods in Python

    Polymorphism! A sturdy pillar in the foundation of Object Oriented Programming. At its simplest, it's the ability to change the implementation of specific methods on derived classes.

    At face value, that could mean entirely rewriting the method. But what if we want a bit more nuance? What if we want to extend the method instead of replacing it entirely?

    I'll continue on my Vehicle example from my previous post on polymorphism, this time in Python:

    from abc import ABC
    
    class Vehicle(ABC):
        def __init__(self, color: str):
            if not color:
                raise ValueError("Color string cannot be null")

            self._passengers = []
            self.color = color

        def load_passenger(self, passenger: str):
            # Logic to load passenger
            self._passengers.append(passenger)
    
        def move(self):
            # Some default code for moving
            print("Moving 1 mile North")

    I've created an Abstract Base Class that serves as a starting point for any derived classes. Within it, I've defined a method move() that moves the vehicle North by 1 mile. All children will have this method available automatically.

    Now, if I want to override that method, it's as simple as declaring a method of the same name in my child classes:

    class Car(Vehicle):
        def move(self):
            print("Driving 1 mile North")
    
    
    class Boat(Vehicle):
        def move(self):
            print("Sailing 1 mile North")

    In the case that we want to extend the functionality, we can use super() to do so:

    class Car(Vehicle):
        def move(self):
            super().move()
            print("Pedal to metal!")
    
    
    class Boat(Vehicle):
        def move(self):
            super().move()
            print("Raising the sail!")

    The benefit here is that whatever arguments the child's method receives can be passed straight through to the default implementation in the parent, and then used again in the child's own custom logic.

    car = Car("red")
    car.move()
    # Moving 1 mile North
    # Pedal to metal!
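
    If move() took arguments, the same pattern holds. Here's a quick variation (not part of the original example) just to show the arguments being forwarded:

    from abc import ABC

    class Vehicle(ABC):
        # A trimmed-down Vehicle just to illustrate argument forwarding
        def move(self, miles: int = 1, direction: str = "North"):
            print(f"Moving {miles} mile(s) {direction}")


    class Car(Vehicle):
        def move(self, miles: int = 1, direction: str = "North"):
            # Pass the same arguments through to the parent's default behavior
            super().move(miles, direction)
            print("Pedal to metal!")


    Car().move(5, "West")
    # Moving 5 mile(s) West
    # Pedal to metal!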

    Angel Eyes

    Listen on Youtube

    'Scuse me while I disappear~ 🌫️


    Parkway

    🌳

    Getting ready to move neighborhoods next month. The current place is just walking distance from a beautiful trail. So I'm savoring it while we're still here!


    All the Things You Are Chord Melody

    Listen on Youtube

    My first swing at chord melody! Love this tune even more on guitar.


    Night Lake

    🌙

    The veil is thin. 👻

    Squeaked in one Inktober drawing this year! Very much directly inspired by the energetic ink work of Violaine Briat's Lil' Dee.


    TypedDicts in Python

    So much of JavaScript/TypeScript is massaging data returned from an endpoint through JSON. TypeScript has the lovely ability to type the objects and their properties that come through.

    While Python is not statically typed like TypeScript, we have this benefit built into the type hinting system.

    It's easier shown than explained:

    from typing import Union, TypedDict
    from datetime import datetime
    
    
    class Concert(TypedDict):
        """
        Type Dict for concert dictionaries.
        """
    
        id: str
        price: int
        artist: str
        show_time: Union[str, datetime]

    All pretty straightforward. We're defining a class that inherits from the TypedDict base class. Then we set our expected properties as type annotations on that class.

    It's ideal to store a class like this in its own types directory in your project.

    A couple of nice ways to use this:

    First, you can use this in your methods where you are expecting to receive this dictionary as an argument:

    def get_concert_ticket_details(
        self, concert: Concert
    ) -> tuple[list[str], set[str]]:
        # Do work
        ...

    You can also directly create a dictionary from this class through instantiation.

    concert = Concert(
        id="28",
        price=50,
        artist="Prince",
        show_time=show_time,
    )

    The benefit of both is, of course, that your editor can flag a property that does not match the expected shape.
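
    For instance, a type checker would flag something like this (a deliberately wrong dictionary, not from the example above):

    bad_concert: Concert = {
        "id": "28",
        "price": "fifty",  # flagged: "price" is expected to be an int
        "artist": "Prince",
        "show_time": "2024-06-01 20:00",
    }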

    More details on Python typing in this previous post. Thorough details available in the official docs.


    Sonny Rollins – Oleo

    Listen on Youtube

    Today I learned that this jazz standard is named after margarine. Yum!


    From the Other Side

    🌑

    We have bobcats and coyotes on the other side of the lake near our home. You can hear them at night. Every now and then, I see one looking back at me 🐺


    Optimistic UI in Next.js with SWR

    I remember the day I logged onto ye olde facebook after a layout change. A few groans later, what really blew me away was the immediacy of my comments on friends' posts. I was used to having to wait for a page refresh, but not anymore! Once I hit submit, I could see my comment right on the page with no wait time.

    That's the power of optimistic UI. Web applications maintain a high level of engagement and native feel by utilizing this pattern. While making an update to the page, the actual form submission is being sent off to the server in the background. Since this is more than likely going to succeed, it's safe to update the UI on the page.

    There are a few libraries that make this process a breeze in React. One option is Vercel's SWR, a React hook for data fetching.

    Data Fetching

    Say I have a component rendering data about several cats. At the top of my React component, I'll fetch the data with the useSWR hook:

    const {data, error, isLoading, mutate} = useSWR(['cats', queryArguments], () => fetchCats(queryArguments));

    If you're familiar with TanStack Query (formerly React Query), this will look very similar. (See my previous post on data fetching in React with TanStack Query for a comparison.)

    To the hook, we pass our key, which will identify this result in the cache, then the function where we are fetching our data (a server action in Next), and optionally some options (left out above).

    That returns to us our data from the fetch, errors if failed, and the current loading state. I'm also extracting a bound mutate method for when we want to revalidate the cache. We'll get to that in a moment.
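
    In the component body, those values typically drive the render. A small sketch (CatList here is a placeholder component):

    if (isLoading) return <p>Loading cats...</p>;
    if (error) return <p>Something went wrong fetching cats.</p>;

    return <CatList cats={data} />;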

    useSWRMutation

    Now that we have data, let's modify it. Next, I'm going to make use of the useSWRMutation hook to create a method for changing our data:

    const { trigger: insertCatMutation } = useSWRMutation(
        ['cats', queryArguments],
        () => fetchCats(queryArguments),
        {
            optimisticData: [generateNewCat(), ...data],
            rollbackOnError: true,
            revalidate: true
        }
    );

    Note that I'm using the same key to signal that this pertains to the same set of data in our cache.

    As you can see, we have an option that we can pass in for populating the cache with our optimistic data. Here, I've provided an array that manually adds the new item through the function generateNewCat(). This will add my new cat data to the front of the array and will show on the page immediately.

    I can then call the trigger function (aliased above as insertCatMutation) in any of my handlers:

    const {error: insertError} = await insertCatMutation(generateNewCat());

    Bound Mutate Function

    Another way of accomplishing this is with the mutate method that we get from useSWR. The main benefit is we now get to pass in options when calling the mutate method.

    const handleDeleteCat = async (id) => {
        try {
            // Call delete server action
            deleteCat({id});
            
            // Mutate the cache
            await mutate(() => fetchCats(queryArguments), {
            // We can also pass a function to `optimisticData`
                // removeCat will return the current cat data after removing the targeted cat data
                optimisticData: () => removeCat(id),
                rollbackOnError: true,
                revalidate: true
            })
        } catch (e) {
            // Here we'll want to manually handle errors
            handleError(e);
        }
    }

    This is advantageous in situations like deletion, where we want to sequentially pass in the current piece of data targeted for removal. That can then be passed both to our server action and updated optimistically through SWR.

    For even more context on using optimistic UI, you can find a great example in the SWR docs.