EUWAN KIM '24
Weston Geophysical | A division of Applied Research Associates, a Defense Threat Reduction Agency partner
Week 4: weston_geophysical.exit()
Hi everyone! Today is my last day at ARA, and I just finished giving my final seminar to the rest of the Weston Geophysical staff. It's been super bittersweet saying my final goodbyes to everyone, and I'm so grateful to Weston Geophysical, ARA, NSERC/DTRA, and most of all my mentor Mr. Bolton for this opportunity. I was asked several times if this experience has convinced me to become a scientist or an engineer (seismologists are always trying to recruit), and while I still don't know what major or career path is best for me yet, I'm not so scared to try out a completely new field anymore.
Mr. Bolton once said that seismologists are like data socialists: they share their data all the time. So here are the results of my experiment. I varied the number of samples per signal and the number of classes. Overall, the network performed pretty well!
Applications for AI:
I don't want to come off as vain, but for the last application for AI, I'll be talking about my own neural network. So why is seismology important in American Grand Strategy? Nuclear weapons and other weapons of mass destruction have dominated discussions about national security for decades, so the US Department of Defense has continually worked to improve its own nuclear arsenal and develop counter-WMD technology. Seismic signals are one of the most efficient and effective ways to monitor nuclear activity, whether it be a missile launch or an explosion.
With a neural network that can classify any seismic signal, you can begin to collect data on different movements. A 10,000 lb TNT explosion that occurs in an underground laboratory at 400 ft depth and 5 miles away emits different signals than an above-ground but submerged-in-water missile launch an entire ocean away. Ultimately, an advanced enough neural network could have explosive source prediction capacity: it could determine the physical source, type, and yield of nuclear activity. This better prepares the DoD for rapid response and expands our deterrence capabilities. Obviously, my neural network can't do that without signal data, but Weston Geophysical has actually spent the last couple of months conducting experiments for this kind of project.
Life in Arlington:
I'm spending the last few days revisiting my favorite Arlington restaurants and taking full advantage of the Army's per diem. It's been an exciting experience to live in a city full of young professionals, especially one that's been changing so rapidly the last few decades. If you ever find yourself working in DC, I definitely recommend Arlington as the perfect suburb-city hybrid. It's got a ton of community spirit, easy access to the metro, and an upscale-but-still-affordable culture.
Week 3.5: print("Progress Complete!")
Hi again, everyone! My internship is starting to wrap up! I'm giving a seminar to the rest of Weston Geophysical this Friday on what I've done during the internship, so this week was spent designing experiments and writing brief reports on them.
After days of data processing, I've finally created a neural network that can classify different seismic signals! As of right now, it can identify four classes (explosion, earthquake, sine wave, and chirp) with up to 90% accuracy, but as long as there's more data, the neural network could work for any type of signal. Mr. Bolton and I have been thinking of some new ways to test the limits of the neural network, so I decided to design my own experiment. I reprocessed the data so there were two variations of each signal: one with 8,000 samples and one with 20,000 (remember our discussion about how neural networks learn better with more data). I also varied the number of classes: sine waves and chirps only, explosions and earthquakes only, and then all four classes. I'm excited to be doing the science part of computer science, even though it's something I haven't done since my high school biology class.
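For anyone curious what an experiment grid like this looks like in code, here's a minimal sketch. The sample lengths and class subsets come from my experiment; the variable names and dictionary layout are just one hypothetical way to organize it.

```python
from itertools import product

# Two sample lengths per signal, three class subsets (from the experiment above).
sample_lengths = [8_000, 20_000]
class_subsets = [
    ("sine", "chirp"),
    ("explosion", "earthquake"),
    ("explosion", "earthquake", "sine", "chirp"),
]

# Every combination of sample length and class subset is one experiment.
experiments = [
    {"samples_per_signal": n, "classes": classes}
    for n, classes in product(sample_lengths, class_subsets)
]

for exp in experiments:
    print(exp["samples_per_signal"], exp["classes"])  # 6 experiments total
```

Each dictionary in `experiments` would then drive one round of data processing and training.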
Applications for AI:
For the last few weeks, I've talked about all kinds of machine learning mechanisms, and I hope it's shown that this field has so much potential for growth. As someone interested in American Grand Strategy and as an Army ROTC cadet, I've always been curious about its applications in the military. You can think of everything from a palm-sized unmanned drone to an augmented reality headset to Palantir, which definitely has its history with Duke students.
The Advanced Targeting and Lethality Automated System in particular has invited a lot of debate over whether we should let autonomous or AI-powered combat vehicles decide who lives and who dies. These conversations aren't unique to combat-related technology; when Tesla's self-driving cars first rolled out, I remember a lot of discussion about whether the car should prioritize the driver or a school bus full of children next to it. Even more controversial is the $10 million Army-Libratus contract. The Army wants to adapt Libratus, a computational game theory AI originally developed to win poker games, to make strategic decisions not only in the field but maybe also in foreign policy.
Life in Arlington:
I actually spent last weekend outside of Arlington at Harpers Ferry National Historical Park, which sits on the border between West Virginia and Maryland. I noticed that even just 30 minutes outside of Arlington, it's complete wilderness. Arlington residents tend to be very active, and I've gotten so many hiking trail recommendations that I've practically memorized the terrain here. The hotel wifi has become unbearably slow and the weather's too nice to be stuck in the office, so I've been making an effort to travel outside of my one-block hotel-office-Chipotle radius.
Week 3: euwan.data = Not Found
May 21, 2021
Welcome back, everyone! One week left in my internship!
I thought that making the neural network would be the most difficult part of my internship. I was very wrong. Preparing the enormous amount of data to be used by the neural network has been the biggest challenge so far. Neural networks are very particular about their inputs: the data must be in the form of an array (think of it as a sort of matrix), and the data must also be balanced, meaning it must have an equal number of samples for each "class"/object it's trying to identify. For the past week, Mr. Bolton and I have been gathering seismic data from various chemical explosions, earthquakes in Northern California, sine waves, and chirps. It took only a few minutes to build the neural network, but it's taken 8 days to prepare the data and I'm still not even close to finished. Once I refine the data, though, the neural network will be ready to train!
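To make the "array" and "balanced" requirements concrete, here's a small sketch of the kind of balancing step involved. The class names match my project, but the signal counts and waveforms are made-up stand-ins (random noise instead of real seismic data).

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical raw data: unequal numbers of signals per class,
# each signal a 1-D waveform of 8,000 samples. Real data would be
# actual explosion/earthquake recordings, not random noise.
raw = {
    "explosion":  rng.normal(size=(120, 8000)),
    "earthquake": rng.normal(size=(300, 8000)),
    "sine":       rng.normal(size=(95, 8000)),
    "chirp":      rng.normal(size=(210, 8000)),
}

# Balance: keep the same number of signals per class
# (the size of the smallest class).
n = min(len(v) for v in raw.values())
classes = sorted(raw)

# Stack everything into one big array plus integer labels.
X = np.concatenate([raw[c][:n] for c in classes])
y = np.concatenate([np.full(n, i) for i, c in enumerate(classes)])

print(X.shape, y.shape)  # every class now contributes exactly n signals
```

The network then trains on `X` and `y`; if one class had far more examples than the others, the network could get decent accuracy just by always guessing that class.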
Also, the Fashion MNIST network is finished! It's relatively successful, with accuracy reaching 90% and loss at around 0.2. If you're interested, you can view my full code here.
Important tip for beginning programmers: when you're first learning to program, tutorials often give you data that has been pre-processed. It didn't take long before I became really dependent on pre-processed data. When learning a new field within programming, make sure to not only learn how to code in that field but also to understand the input data's attributes. Particularly with neural networks, the data for each network must be fine-tuned to the network's needs and functions. Becoming too reliant on pre-processed data left me really confused when I had to start processing it on my own!
Applications for AI:
If you've ever used Gmail or iMessage, you've probably noticed that after you type just a couple of words, your computer or phone starts suggesting ways to finish your sentences. For instance, whenever I start typing "Have a" into Gmail, I get suggested "great day!" to finish my sentence. Sometimes, if I type something even as short and ambiguous as "I hope", Gmail suggests "you can finish this project by next week." How can my phone predict what I'm thinking? Last week, I talked about CNNs (Convolutional Neural Networks) but this week, let's talk about RNNs (Recurrent Neural Networks).
RNNs are all about how much data you have. Even a simple RNN requires millions of sentences and word combinations to train itself to a decent level of utility. When you pass the data through an RNN, each word is mapped to its own vector that describes its place in the sentence as well as its distance from other words. The probability of a certain word appearing near another word is then calculated for each vector. The RNN is trained with various "weighted averages" of these probabilities until it can successfully predict what words come before and after one input word. More advanced RNNs can also map words by meaning, not just location. For instance, the word "queen" could be mapped to "ruler" and "power" even if those three words don't appear in a sentence together very often in the data. Learning those deeper relationships requires enormous amounts of data and computing power, which is why only large tech conglomerates like Google and Apple can heavily invest in developing those RNNs. Gmail actually uses an even more advanced RNN called long short-term memory (LSTM), which is how it can recommend entire sentences based on everything else you've written in the email.
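A real RNN is far beyond a blog post, but the core idea of "the probability of a certain word appearing near another word" can be sketched in a few lines. This toy example just counts which word most often follows each word in a tiny made-up corpus; everything here (the corpus, the function names) is hypothetical.

```python
from collections import Counter, defaultdict

# Toy "corpus" — a real model would need millions of sentences.
corpus = [
    "have a great day",
    "have a great weekend",
    "have a nice day",
    "i hope you are well",
]

# Count how often each word follows each other word (bigram counts).
following = defaultdict(Counter)
for sentence in corpus:
    words = sentence.split()
    for a, b in zip(words, words[1:]):
        following[a][b] += 1

def predict_next(word):
    """Return the word most likely to follow `word` in the corpus."""
    if word not in following:
        return None
    return following[word].most_common(1)[0][0]

print(predict_next("a"))  # "great" follows "a" twice, "nice" only once
```

An RNN does something far more sophisticated (it carries context from the whole sentence, not just the previous word), but this is the seed of the idea: predictions come from statistics over huge amounts of text.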
However, through the collective effort of computer scientists at organizations large and small, there are now programs and apps like GPT-3 or Ghost Writer that can write entire essays from just one input sentence. This New Yorker article is written partly by a machine! I once had GPT-3 write me a cover letter for a grocery store cashier position using only a few bullet points from my resume. It was incredibly choppy, and I doubt I would've gotten the job at Stop and Shop using that letter, but I was still really impressed with the machine's ability to match each bullet point to a qualification that it thought the job necessitated.
Life in Arlington:
Summer has finally arrived in Arlington! With the new CDC mask mandates slowly being lifted, restaurants are starting to get packed again - which of course makes it more time-consuming for me to get tacos. ARA has also lifted its mask mandate for vaccinated employees, and the office is almost at 2/3 capacity now. It makes me a little sullen that just when I get to meet more people, I have only a week left here. More interns have joined me, and I now have an office roommate!
Week 2: count(weeks) = 2
May 16, 2021
Hello, everyone! I just passed the halfway mark of my internship!
I am now on the last chapter of the NNFS textbook! All that's left to do is create a neural network that classifies Fashion MNIST - a "fashion" version of the previously discussed MNIST database where, instead of digits, the images are of various pieces of clothing. It's basically putting everything I've learned to the test, and if I can do this correctly, I can move on to adapting my neural network for seismic data. It's kind of crazy that just two weeks ago, it took me two very confusing hours just to install Python on my laptop.
A tip for beginning programmers: do not get discouraged! Last week's coding progress was a bit slow because I got caught up in all the tiny intricacies and complications of programming. Once those complications start building up, it can feel like all your efforts are pointless. But just look ahead to when you do finish your project. Think about all the contributions that small-time programmers have made to computer and data science, even if it's just by practicing basic coding. There's no point in developing this technology if no one ever uses it. So if you're facing a lot of challenges, just know that your time and energy are never pointless.
Applications for AI:
Wednesday's application for AI discussed CNNs, a machine learning mechanism that can classify parts of an image. There are so many uses of CNNs in the medical world, everything from identifying diseases to determining risk of admission. One team of researchers at Duke tackled the challenge of using CNNs to diagnose thyroid cancers.
It's often difficult for pathologists to identify thyroid malignancies, especially since they can be incredibly small and thyroids have so much surface area. Hence, many cases of thyroid cancer are diagnosed as "indeterminate", meaning the doctor cannot conclude whether the cancer is malignant or benign. This also makes CNNs difficult to use for thyroid malignancies: most CNN datasets include millions of 256 x 256 pixel images, but slides of thyroid biopsies can be up to 100,000 x 150,000 pixels, and the team only had access to around a thousand of them. In general, a machine learns best when it has far more data to work from than that.
The researchers discovered that the thyroid biopsy slides could be divided into multiple image regions, and the predictions for each region could be averaged to get a single value for the entire slide. The CNN's filters consist of shapes that resemble growing tumors, and each filter is "slid" over every image region of the biopsy slide.
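The divide-and-average strategy is easy to sketch. This toy version uses a small random array in place of a 100,000 x 150,000-pixel slide and a dummy scoring function in place of a trained CNN; the function and variable names are mine, not the study's.

```python
import numpy as np

rng = np.random.default_rng(1)

def tile_predictions(slide, tile, predict):
    """Split a big slide into tile x tile regions, score each region,
    and average the scores into one prediction for the whole slide."""
    h, w = slide.shape
    scores = []
    for i in range(0, h - tile + 1, tile):
        for j in range(0, w - tile + 1, tile):
            region = slide[i:i + tile, j:j + tile]
            scores.append(predict(region))
    return float(np.mean(scores))

# Stand-in "slide" and "model": a real pipeline would feed each
# 256 x 256 region to a trained CNN instead of taking its mean.
slide = rng.random((1024, 1024))
score = tile_predictions(slide, tile=256, predict=lambda r: r.mean())
print(round(score, 3))  # one averaged score for the whole slide
```

This way the CNN only ever sees manageable 256 x 256 regions, yet the slide still gets a single overall prediction.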
While the CNN only matched human performance in determining whether a cancer was malignant or benign, it did succeed in reducing the number of indeterminate cases by pathologists, ultimately reducing unnecessary surgeries by 20%! So the machine can't be trusted to make all diagnoses and medical decisions yet (and of course, there are so many policies and regulations to consider as well). A human-machine hybrid approach, though, could drastically improve thyroid cancer diagnoses!
Note: The research team included David Dov, Shahar Kovalsky, Jonathan Cohen, Danielle Range, Ricardo Henao, and Lawrence Carin. The full study can be read here.
Life in Arlington:
Arlington is really starting to grow on me. The past weekend, there was a huge festival in Ballston called QuarterFest, which all Arlington residents took as an opportunity to party for three days straight. All restaurants and bars fully opened and offered insane discounts, and there was live music on almost every street and park. I had some of the best meals I've ever had in my life for really cheap! Arlington hosts a ton of small-city public events and festivals like QuarterFest, and people always show up and go all-out. They're very proud of their city, which differs from my very disgruntled New Jersey hometown.
I hope everyone has been enjoying my blog so far, and I'll see you in a few days when I've (hopefully) created a full neural network!
Week 1.5: import programming as struggles
May 12, 2021
Hello again, everyone! It's now the second week of my internship, and I'm really in the thick of it now.
I'm about 2/3 of the way through the NNFS textbook, but progress has definitely slowed down. Although my neural network has reached an accuracy of 98% (yay!), I'm having trouble understanding how each piece fits into the greater context of the textbook. It's been difficult staying motivated, but Mr. Bolton has done an amazing job of walking me through all my plateaus. I'm hoping that as I get closer to actually applying my neural network to Weston Geophysical's data, some of the excitement trickles back in.
Another tip for beginning programmers: Google is your best friend! Even though my mentors always say that no question is stupid, I'm often afraid of asking too many trivial questions (no doubt a few of them are indeed pretty stupid). Whenever I get an error in Python, I first copy and paste it into Google and read a few search results before resorting to frantically asking Mr. Bolton for advice. Other programmers have definitely run into the same problems you're having, and there are a ton of resources online to help you.
Applications for AI:
If you've ever flown out of the JFK airport in NY before, you'd probably agree with me in saying that it would take more time to go through security there than to just walk to your destination. However, without something called Convolutional Neural Networks (CNNs), your hassles with TSA would take much longer. CNNs are a machine learning mechanism usually used for image analysis, including X-ray scans of your carry-on luggage.
CNNs are really powerful because they can easily recognize shapes such as lines, dots, and circles. A programmer could combine dots, lines, and circles to construct a CNN filter with the same shape as what they're looking for. The filter is then "slid" over the image to see if the image's shape fits that of the filter. For instance, if you wanted the CNN to identify a car, you might create a filter with a rectangle on top of a few circles (I admit, a very rudimentary car). The CNN "slides" the filter over an image of your Toyota Prius, and since the Prius' shape overlaps the general shape of a rectangle and a few circles, the CNN identifies it as a car.
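The "sliding" in that description is just repeated multiplication and summing, which you can write out directly. Here's a minimal sketch (a plain cross-correlation, with a toy vertical-line filter rather than anything TSA-grade):

```python
import numpy as np

def slide_filter(image, filt):
    """Slide `filt` over `image` and record how strongly each
    position matches (the higher the sum, the better the overlap)."""
    ih, iw = image.shape
    fh, fw = filt.shape
    out = np.zeros((ih - fh + 1, iw - fw + 1))
    for i in range(out.shape[0]):
        for j in range(out.shape[1]):
            out[i, j] = np.sum(image[i:i + fh, j:j + fw] * filt)
    return out

# A tiny filter for a vertical line, and an image containing one.
filt = np.array([[1.0], [1.0], [1.0]])
image = np.zeros((5, 5))
image[1:4, 2] = 1.0   # a 3-pixel vertical line in column 2

response = slide_filter(image, filt)
print(response)  # the strongest value sits where the filter lines up with the line
```

A real CNN learns its filters from data instead of having them hand-drawn, and stacks many layers of them, but the sliding operation itself is exactly this.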
TSA constructs filters that specifically point out objects they would deem as life-threateningly dangerous (e.g. a bottle of toothpaste larger than 3.4 ounces). Every time a piece of luggage passes through the scanner, an X-ray image of it is taken. TSA's "dangerous object" filters are slid over the X-ray images, and if none of the objects inside the luggage overlap with the shape of a large bottle of toothpaste, it's not flagged. Without CNNs, multiple TSA agents would have to closely observe each X-ray image to deem it safe. So the next time you're in line at JFK security, be grateful for the CNNs that have made your wait time only 3 hours instead of 10.
Life in Arlington:
I think it's important to note that Arlington is a huge sports town. Everyone here is either a Caps, Nationals, or Virginia Tech fan. There's of course a few UNC stragglers here and there as well (unfortunately). In a few days, all sports events are opening at full capacity, and I can feel the impatience radiating from Arlington residents. Even in the office, all break room conversations have been about what sports game everyone is attending this weekend. Since the Caps' arena is only a block away from my hotel, I've been mentally preparing myself for the streets to be overflowing. As a Jersey Devils and Blue Devils fan, I definitely feel like I'm in the minority here.
Thank you all for reading, and I'll see you in a few days!
Week 1: find("Euwan") = Arlington, VA
May 9, 2021
Welcome back, everyone! I've finally finished my first week with Weston Geophysical, and my brain is pretty fried from quite literally learning Python and linear algebra from scratch. In case anyone's been wondering where and how exactly I've been working on this project, I have my own office! Most ARA employees are still working remotely, although as more people get vaccinated, the office has slowly been getting busier. But since almost 2/3 of the office is still remote, I've been granted the empty office of a nuclear scientist named Alok Neopane. Thanks, Alok!
I'm about a third of the way through the NNFS textbook, so at this point, I've created a very rudimentary neural network. It just hasn't been optimized yet, meaning that it doesn't improve much over several trials. Right now, accuracy levels out at about 30%, which is not great by scientists' standards... or really anyone's standards. I meet with my mentor Mr. Bolton almost twice a day to ask questions (and believe me, I have a lot), so I'm really grateful that he's been so willing to help me every step of the way. It's sort of like having my own personal TA!
For other non-STEM students who are considering or also learning programming, here's a piece of advice I've gotten from Mr.Bolton: write down everything. It's easy to just copy and paste a tutorial's code onto your computer and think "oh, I'll remember what this means later." Trust me, you won't. I've been writing the context behind each piece of code in a separate notebook that I can skim at the end of the day to review what I've learned.
Applications for AI:
We can't talk about AI without introducing one of the most famous datasets used in machine learning: the Modified National Institute of Standards and Technology database (aka the MNIST database). The MNIST database contains 70,000 images of handwritten digits, like a handwritten 0, 1, 2, and so on. It's often used to train image classification neural networks. I won't go into all the math and science behind it, but basically, a computer can use MNIST to train its ability to determine whether any handwritten digit is a 0, 1, 2, etc.
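To give a feel for the classify-a-digit idea without the full MNIST pipeline, here's a heavily simplified sketch: tiny 3x3 "digits" instead of 28x28 images, and a nearest-template rule instead of a trained neural network. The templates and function names are made up for illustration.

```python
import numpy as np

# Toy 3x3 patterns standing in for 28x28 MNIST digit images.
templates = {
    0: np.array([[1, 1, 1],
                 [1, 0, 1],
                 [1, 1, 1]], float),
    1: np.array([[0, 1, 0],
                 [0, 1, 0],
                 [0, 1, 0]], float),
}

def classify(image):
    """Label an image with the digit whose template it is closest to
    (smallest squared pixel-by-pixel difference)."""
    return min(templates, key=lambda d: np.sum((image - templates[d]) ** 2))

# A slightly messy "1" still lands on the right template.
messy_one = np.array([[0, 1, 0],
                      [1, 1, 0],
                      [0, 1, 0]], float)
print(classify(messy_one))  # 1
```

A real neural network learns something much richer than two fixed templates, which is what lets it cope with 70,000 wildly varied handwriting samples.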
The postal service benefits a lot from this. I used to think that there was always someone behind the counter who read the address on every single package. For the past few decades though, the USPS has been using Optical Character Readers (OCRs) trained on the MNIST database that can identify the recipient's name and address on a package. Of course, the MNIST database has its limitations. In my opinion, most of the handwritten digits are really legible and neat, so OCRs might have trouble with messy handwriting. However, OCRs optimize over time; if an OCR encounters a messy digit that it can't classify, it will output a "guess". A human will read it afterwards and tell the machine whether it was correct, allowing the OCR to learn from its mistakes. So if your handwriting is particularly illegible, you're actually helping the computer improve!
Life in Arlington:
When I used to live in the Arlington area, it was pretty much a desolate suburb right outside of DC. Nowadays, it's an up-and-coming city. I would even argue it's already upped and come. During my internship, I'm living in a super gentrified neighborhood in the center of Arlington called Ballston. There are so many upscale restaurants and cafes, tons of entertainment options (including an adult version of Chuck E. Cheese), and every building is either a corporate office or a condo. It's a great place for young professionals, and everyone I've met here is between 20 and 40 years old. Families tend to move towards the suburbs as kids get older. I've definitely been taking advantage of all the incredible food here even if it's a bit pricey. The Army's paying for my meals, so I try not to care too much.
Thank you all for tuning in to my blog so far!
Week 0.5: print("Hello World.")
May 5, 2021
Hi everyone! Welcome to my blog!
I'm Euwan, a rising sophomore and Army ROTC cadet at Duke, and this summer, I'll be participating in the NSERC internship in-person. While interning, I'm placed on temporary active duty, meaning that the Army is paying for all my meals, travel, lodging, and stipends. I'd say it's a pretty good deal.
The description of my internship is a bit overcomplicated (not surprising considering it's organized by the US Department of Defense). The NSERC internship is run by the Nuclear Science and Engineering Research Center of the Defense Threat Reduction Agency (a DoD agency). DTRA assigned me to intern at Weston Geophysical, a group within Applied Research Associates (ARA) and one of the DoD's corporate partners located in Massachusetts, but I'm physically working at ARA's Arlington Division. In other words, I'm working with Weston Geophysical but am stationed in Arlington. Weston Geophysical performs a lot of research on seismic monitoring, which of course has several national security implications. In the future, an advanced enough seismic monitoring program could detect seismic movement, determine whether it was an earthquake or a nuclear explosion, and accurately locate its origin. With technology like that, national security agencies could quickly respond to nuclear threats.
So what am I doing here? The NSERC internship is catered specifically to my interests and goals, so as someone who still doesn't know what she'll major in, I wanted to use NSERC to explore an unfamiliar field. A few months ago, I took a Winter Breakaway course called Artificial Intelligence For Everyone, where I learned the concepts and applications of several machine learning mechanisms. While I'm definitely not a programmer or even a STEM-strong student, I was so fascinated by how scientists were able to code these programs and apply them to everyday problems. Duke itself is accomplishing incredible things with AI! With Weston Geophysical, I'm learning Python (with which I've had zero experience) so that I can code my own neural network from scratch that will (hopefully) classify various seismic and acoustic signals such as earthquakes, chirps, or nuclear explosions.
In the last three days, I've taught myself the basics of Python, learned enough linear algebra to understand how neural networks function, and started working through a textbook with my mentor Mr. Bolton. This is probably the most math I have ever done in my life, but everyone at Weston has been incredibly patient with me. For the next four weeks, I'll give a rundown on the coding progress I've made, an existing or potential application for AI, and my overall experience living in Arlington. I actually used to live here when I was younger, and I've got to say it's changed a lot. More to come about that!
Thank you all for tuning in, and I'll talk to you soon.
I also encourage you all to follow along with my coding and AI experiences. Duke's Center for Computational Thinking has so many resources that have greatly helped me, including a Coursera and an upcoming machine learning summer school that I'll also be attending.