Hacking our way to better team meetings



Summarization header image

As someone who takes a lot of notes, I’m always on the lookout for tools and techniques that can help me refine my own note-taking process (such as the Cornell Method). And while I generally prefer pen and paper (because it’s shown to help with retention and synthesis), there’s no denying that technology can help to enhance our built-up abilities. This is especially true in situations such as meetings, where actively participating and taking notes at the same time can be in conflict with one another. The distraction of looking down to jot down notes or tapping away on the keyboard can make it hard to stay engaged in the conversation, because it forces us to make quick decisions about what details are important, and there’s always the risk of missing important details while trying to capture previous ones. Not to mention, when faced with back-to-back-to-back meetings, the challenge of summarizing and extracting important details from pages of notes is compounding – and when considered at a group level, there is significant individual and group time wasted in modern business with these types of administrative overhead.

Faced with these problems on a daily basis, my team – a small tiger team I like to call OCTO (Office of the CTO) – saw an opportunity to use AI to augment our team meetings. They’ve developed a simple and straightforward proof of concept for ourselves, which uses AWS services like Lambda, Transcribe, and Bedrock to transcribe and summarize our virtual team meetings. It allows us to gather notes from our meetings, but stay focused on the conversation itself, as the granular details of the discussion are automatically captured (it even creates a list of to-dos). And today, we’re open sourcing the tool, which our team calls “Distill”, in the hopes that others might find this useful as well: https://github.com/aws-samples/amazon-bedrock-audio-summarizer.

In this post, I’ll walk you through the high-level architecture of our project, how it works, and give you a preview of how I’ve been working alongside Amazon Q Developer to turn Distill into a Rust CLI.

The anatomy of a simple audio summarization app

The app itself is simple — and this is intentional. I subscribe to the idea that systems should be made as simple as possible, but no simpler. First, we upload an audio file of our meeting to an S3 bucket. Then an S3 trigger notifies a Lambda function, which initiates the transcription process. An EventBridge rule is used to automatically invoke a second Lambda function when any Transcribe job beginning with summarizer- has a newly updated status of COMPLETED. Once the transcription is complete, this Lambda function takes the transcript and sends it with an instruction prompt to Bedrock to create a summary. In our case, we’re using Claude 3 Sonnet for inference, but you can adapt the code to use any model available to you in Bedrock. When inference is complete, the summary of our meeting — including high-level takeaways and any to-dos — is stored back in our S3 bucket.
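To make that flow concrete, here is a rough sketch (not the actual code from the repo) of what that second Lambda function could look like in Python with boto3. The model ID, prompt, bucket name, and output key are placeholders, and it assumes the Transcribe job writes to the service's default output location, so the transcript URI can be fetched directly:

```python
import json
import urllib.request

import boto3

transcribe = boto3.client("transcribe")
bedrock = boto3.client("bedrock-runtime")
s3 = boto3.client("s3")

# Placeholder model ID and prompt; adapt these to the model you want to use.
MODEL_ID = "anthropic.claude-3-sonnet-20240229-v1:0"
PROMPT = "Summarize this meeting transcript and list any action items."


def handler(event, context):
    # The EventBridge rule passes the completed Transcribe job name in the event detail.
    job_name = event["detail"]["TranscriptionJobName"]
    job = transcribe.get_transcription_job(TranscriptionJobName=job_name)
    transcript_uri = job["TranscriptionJob"]["Transcript"]["TranscriptFileUri"]

    # With Transcribe's default output location this is a pre-signed URL,
    # so the transcript JSON can be fetched directly.
    with urllib.request.urlopen(transcript_uri) as response:
        results = json.loads(response.read())
    transcript = results["results"]["transcripts"][0]["transcript"]

    # Ask Claude 3 Sonnet on Bedrock (Messages API) for a summary.
    body = json.dumps({
        "anthropic_version": "bedrock-2023-05-31",
        "max_tokens": 2048,
        "messages": [{"role": "user", "content": f"{PROMPT}\n\n{transcript}"}],
    })
    result = bedrock.invoke_model(modelId=MODEL_ID, body=body)
    summary = json.loads(result["body"].read())["content"][0]["text"]

    # Write the summary back to the bucket (name and key are illustrative).
    s3.put_object(
        Bucket="distill-audio-bucket",
        Key=f"processed/{job_name}-summary.txt",
        Body=summary.encode("utf-8"),
    )
```

Swapping in a different Bedrock model is largely a matter of changing the model ID and shaping the request body to match that model's API.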

Distill architecture diagram

I’ve spoken many times about the importance of treating infrastructure as code, and as such, we’ve used the AWS CDK to manage this project’s infrastructure. The CDK gives us a reliable, consistent way to deploy resources, and ensure that infrastructure is sharable to anyone. Beyond that, it also gave us a good way to rapidly iterate on our ideas.
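For illustration, the EventBridge wiring described above could be expressed in a Python CDK stack roughly like this; the construct IDs, runtime, and asset path are placeholders rather than the repo's exact stack:

```python
from aws_cdk import Stack, aws_events as events, aws_events_targets as targets, aws_lambda as _lambda
from constructs import Construct


class DistillStack(Stack):
    def __init__(self, scope: Construct, construct_id: str, **kwargs) -> None:
        super().__init__(scope, construct_id, **kwargs)

        # Summarization function (packaging details omitted for brevity).
        summarize_fn = _lambda.Function(
            self, "SummarizeFunction",
            runtime=_lambda.Runtime.PYTHON_3_12,
            handler="summarize.handler",
            code=_lambda.Code.from_asset("lambda"),
        )

        # Invoke the function whenever a Transcribe job whose name starts
        # with "summarizer-" reaches the COMPLETED state.
        events.Rule(
            self, "TranscribeCompletedRule",
            event_pattern=events.EventPattern(
                source=["aws.transcribe"],
                detail_type=["Transcribe Job State Change"],
                detail={
                    "TranscriptionJobStatus": ["COMPLETED"],
                    "TranscriptionJobName": [{"prefix": "summarizer-"}],
                },
            ),
            targets=[targets.LambdaFunction(summarize_fn)],
        )
```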

Using Distill

If you try this out (and I hope that you will), the setup is quick. Clone the repo, and follow the steps in the README to deploy the app infrastructure to your account using the CDK. After that, there are two ways to use the tool:

  1. Drop an audio file directly into the source folder of the S3 bucket created for you, wait a few minutes, then view the results in the processed folder (a minimal sketch of this flow follows this list).
  2. Use the Jupyter notebook we put together to step through the process of uploading audio, monitoring the transcription, and retrieving the audio summary.
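If you would rather script the first option than use the console or the notebook, a minimal boto3 sketch of that flow might look like this; the bucket name, file name, and folder layout are assumptions based on the description above:

```python
import time

import boto3

s3 = boto3.client("s3")

# Placeholder bucket name; use the bucket the CDK stack created for you.
BUCKET = "distill-<account>-<region>"

# Drop the meeting recording into the source folder to kick off the pipeline.
s3.upload_file("team-meeting.m4a", BUCKET, "source/team-meeting.m4a")

# Poll the processed folder until the summary shows up (usually a few minutes).
while True:
    listing = s3.list_objects_v2(Bucket=BUCKET, Prefix="processed/")
    keys = [obj["Key"] for obj in listing.get("Contents", [])]
    if any(key.endswith(".txt") for key in keys):
        print("Summaries available:", keys)
        break
    time.sleep(30)
```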

Here’s an example output (minimally sanitized) from a recent OCTO team meeting that only part of the team was able to attend:

Here is a summary of the conversation in readable paragraphs:

The group discussed potential content ideas and approaches for upcoming events like VivaTech and re:Invent. There were suggestions around keynotes versus having fireside chats or panel discussions. The importance of crafting thought-provoking content for upcoming events was emphasized.

Recapping Werner’s recent Asia tour, the team reflected on highlights like engaging with local university students, developers, startups, and underserved communities. Indonesia’s initiatives around disability inclusion were praised. Useful feedback was shared on logistics, balancing work with downtime, and optimal event formats for Werner. The group plans to investigate turning these learnings into an internal publication.

Other topics covered included upcoming advisory meetings, which Jeff may attend virtually, and the evolving role of the modern CTO with increased focus on social impact and global perspectives.

Key action items:

  • Reschedule team meeting to next week
  • Lisa to circulate upcoming advisory meeting agenda when available
  • Roger to draft potential panel questions for VivaTech
  • Explore recording/streaming options for VivaTech panel
  • Determine content ownership between teams for summarizing Asia tour highlights

What’s more, the team has created a Slack webhook that automatically posts these summaries to a team channel, so that those who couldn’t attend can catch up on what was discussed and quickly review action items.
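The webhook itself isn’t part of the open-sourced code, but posting a summary to a Slack incoming webhook only takes a few lines; this sketch uses a placeholder webhook URL:

```python
import json
import urllib.request

# Placeholder webhook URL; Slack incoming webhooks accept a simple JSON payload.
WEBHOOK_URL = "https://hooks.slack.com/services/XXX/YYY/ZZZ"


def post_summary_to_slack(summary: str) -> None:
    payload = json.dumps({"text": summary}).encode("utf-8")
    request = urllib.request.Request(
        WEBHOOK_URL,
        data=payload,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(request) as response:
        response.read()  # Slack replies with "ok" on success
```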

Remember, AI is not perfect. Some of the summaries we get back, the above included, have errors that need manual adjustment. But that’s okay, because it still speeds up our processes. It’s simply a reminder that we must still be discerning and involved in the process. Critical thinking is as important now as it has ever been.

There’s value in chipping away at everyday problems

This is just one example of a simple app that can be built quickly, deployed in the cloud, and that can lead to organizational efficiencies. Depending on which study you look at, around 30% of corporate employees say that they don’t complete their action items because they can’t remember key information from meetings. We can start to chip away at stats like that by having tailored notes delivered to you immediately after a meeting, or an assistant that automatically creates work items from a meeting and assigns them to the right person. It’s not always about solving the “big” problem in one swoop with technology. Sometimes it’s about chipping away at everyday problems. Finding simple solutions that become the foundation for incremental and meaningful innovation.

I’m particularly interested in where this goes next. We now live in a world where an AI-powered bot can sit on your calls and act in real time. Taking notes, answering questions, tracking tasks, removing PII, even looking things up that would have otherwise been distracting and slowing down the call while one individual tried to find the data. By sharing our simple app, the intention isn’t to show off “something shiny and new”, it’s to show you that if we can build it, so can you. And I’m curious to see how the open-source community will use it. How they’ll extend it. What they’ll create on top of it. And this is what I find really exciting — the potential for simple AI-based tools to help us in more and more ways. Not as replacements for human ingenuity, but as aides that make us better.

To that end, working on this project with my team has inspired me to take on my own pet project: turning this tool into a Rust CLI.

Building a Rust CLI from scratch

I blame Marc Brooker and Colm MacCárthaigh for turning me into a Rust enthusiast. I’m a systems programmer at heart, and that heart started to beat a lot faster the more familiar I got with the language. And it became even more important to me after coming across Rui Pereira’s great research on the energy, time, and memory consumption of different programming languages, when I realized its great potential to help us build more sustainably in the cloud.

During our experiments with Distill, we wanted to see what moving a function from Python to Rust would look like. With the CDK, it was easy to make a quick change to our stack that let us move a Lambda function to the AL2023 runtime, then deploy a Rust-based version of the code. If you’re curious, the function averaged cold starts that were 12x faster (34ms vs 410ms) and used 73% less memory (21MB vs 79MB) than its Python variant. Inspired, I decided to really get my hands dirty. I was going to turn this project into a command line utility, and put some of what I’ve learned from Ken Youens-Clark’s “Command Line Rust” into practice.
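The stack change itself is small. Assuming the Rust binary is built with cargo-lambda, the function definition from the CDK sketch earlier might change to something like this (construct ID and asset path are placeholders):

```python
from aws_cdk import aws_lambda as _lambda

# Inside the same CDK stack as before, swap the Python function for a Rust
# binary built with cargo-lambda (construct ID and asset path are placeholders).
summarize_fn = _lambda.Function(
    self, "SummarizeFunctionRust",
    runtime=_lambda.Runtime.PROVIDED_AL2023,  # Amazon Linux 2023 custom runtime
    handler="bootstrap",                      # ignored by custom runtimes
    code=_lambda.Code.from_asset("target/lambda/summarize"),
)
```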

I’ve always loved working from the command line. Every grep, cat, and curl into that little black box reminds me a lot of driving an old car. It may be a little bit harder to turn, it might make some noises and complain, but you feel a connection to the machine. And being active with the code, much like taking notes, helps things stick.

Not being a Rust guru, I decided to put Q to the test. I still have plenty of questions about the language, its idioms, the ownership model, and common libraries I’d seen in sample code, like Tokio. If I’m being honest, learning how to interpret what the compiler is objecting to is probably the hardest part of programming in Rust for me. With Q open in my IDE, it was easy to fire off “stupid” questions without stigma, and using the references it provided meant that I didn’t have to dig through troves of documentation.

Summary of Tokio

As the CLI started to take shape, Q played a more significant role, providing deeper insights that informed coding and design decisions. For instance, I was curious whether using slice references would introduce inefficiencies with large lists of items. Q promptly explained that while slices of arrays can be more efficient than creating new arrays, there’s a possibility of performance impacts at scale. It felt like a conversation – I could bounce ideas off of Q, freely ask follow-up questions, and receive immediate, non-judgmental responses.

Advice from Q on slices in Rust

The last thing I’ll mention is the feature to send code directly to Q. I’ve been experimenting with code refactoring and optimization, and it has helped me build a better understanding of Rust, and pushed me to think more critically about the code I’ve written. It goes to show just how important it is to create tools that meet developers where they’re already comfortable – in my case, the IDE.

Send code to Q

Coming soon…

In the next few weeks, the plan is to share the code for my Rust CLI. I need a bit of time to polish it off, and to have folks with a bit more experience review it, but here’s a sneak peek:

Sneak peek of the Rust CLI

As always, now go build! And get your hands dirty while doing it.
