Welch Labs Now Offering Computer Vision and Machine Learning Consulting Services

It’s an exciting time to be in machine learning and computer vision. A rapidly advancing state of the art is creating new ways to solve old problems and bringing about entirely new business opportunities.

Along with these new opportunities come new challenges:

1.    Keeping up with the latest developments is hard: roughly 35,000 new computer science papers were published on arXiv in 2018 alone.

2.    Hype/noise can make it difficult to see where the real opportunities are – what’s really feasible and what isn’t? Where is real value being created, and what hot topics will fade away in the coming years?

3.    The tooling is changing rapidly – TensorFlow will be just 4 years old this year, and PyTorch will be 3. No one on the planet has more than 5 years of experience with these tools.

What machine learning or computer vision innovations will drive value for your business? How will you go about getting the solutions out of the lab and into production?

Welch Labs is uniquely positioned to help your business tackle these challenges and capitalize on new opportunities. We have deep expertise in modern algorithm design, implementation, testing, and deployment. We work closely with your team to identify the truly impactful innovations for your new and existing business applications; build, test, and deploy solutions; and clearly communicate processes and results.

Have a tough machine learning or computer vision problem? We would love to hear more about it – reach us here.

No New Series (Yet)

Back in April I made the audacious claim that I was going to create a new series, and that it would launch in July. Well, July has come and almost gone, with no new Welch Labs series to show for it.

I've made some good progress, but all the pieces just haven't come together yet.

I couldn't be happier with the topic - waves, applications of complex numbers, and Euler's formula cover a fascinating and impossibly large swath of physics. Thus far I've homed in on a compelling narrative that I think the series will turn on: the vibrating string. The vibrating string was at the center of the development of an astounding amount of mathematics, and it offers some really beautiful scientific and mathematical mysteries.

A lesson I'm quickly learning about my topic of choice is the huge range of mathematical rigor it spans - literally from elementary school to graduate school. For this reason I'm considering breaking the topic up across several short series (ideally 3-5 episodes each). I'm thinking that series one will require very little mathematical background, series two will require knowledge of pre-calculus and complex numbers, and series three will require calculus. I'm hoping to launch the first series in August or September.

Finally, I'd like to share some of the interesting resources I've come across thus far.

Great Books

A Student's Guide to Waves

Good Books

Rameau and Musical Thought in the Enlightenment (Cambridge Studies in Music Theory and Analysis) 

A Course in Mathematical Methods for Physicists

Origins in Acoustics: The Science of Sound from Antiquity to the Age of Newton

A Source Book in Mathematics, 1200-1800 (Princeton Legacy Library)

The Language of Physics: The Calculus and the Development of Theoretical Physics in Europe, 1750–1914

Imaginary Numbers Are Real [Part 13: Riemann Surfaces]

It took over a year, but the Imaginary Numbers Series is finally complete. By far the most labor-intensive parts were part 1 and part 13. When I began the series I had no idea where it would end up. I originally planned on 6 parts, but the deeper I got into imaginary numbers, the cooler things got - and I just couldn't bring myself to tell an incomplete story.

I couldn't be happier with where the series ended up - I'm so happy I was able to talk about Riemann Surfaces. I'm sure a mathematician or two will take issue with my presentation (there's a reason Riemann Surfaces are a graduate-level mathematics topic!), but I hope I was able to give a broad audience a taste of these beautiful mathematical structures without oversimplifying the meaning out of things.

I wanted to share a few of the visualizations from Part 13. I used the wonderful visualization tool plotly for all the 3D graphics:

Riemann Surface for \(w=\sqrt{z}\) 

Paths on Riemann Surface. This one's a bit slow; it takes lots of points to make that path!

3D surface plot from opening scene. 
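For anyone curious how a surface like this is built, here's a minimal numpy sketch (my own reconstruction, not the original plotting scripts) of the parameterization behind a Riemann surface for \(w=\sqrt{z}\): follow \(z = re^{i\theta}\) over two full turns so both branches of the square root are traced continuously, then feed the real and imaginary parts to plotly.

```python
# Minimal sketch (assumptions, not the original scripts) of the
# parameterization behind a Riemann-surface plot of w = sqrt(z):
# sweep z = r*e^(i*theta) over TWO full turns (theta in [0, 4*pi])
# so both sheets of the square root are traced without a branch cut.
import numpy as np

r = np.linspace(0.0, 4.0, 60)
theta = np.linspace(0.0, 4.0 * np.pi, 241)  # two revolutions = two sheets
R, T = np.meshgrid(r, theta)

Z = R * np.exp(1j * T)               # points in the z-plane (covered twice)
W = np.sqrt(R) * np.exp(1j * T / 2)  # sqrt(z) followed continuously

# After one turn (theta = 2*pi) we land on the OTHER sheet: w flips sign.
print(np.allclose(W[120, :], -W[0, :]))  # True

# After two turns (theta = 4*pi) we're back where we started.
print(np.allclose(W[240, :], W[0, :]))   # True

# For the picture itself, these arrays would go to plotly, e.g.
# go.Surface(x=Z.real, y=Z.imag, z=W.real, surfacecolor=W.imag)
```

Using \(\mathrm{Re}(w)\) as the height and \(\mathrm{Im}(w)\) as the color is one common convention for flattening the 4-dimensional graph into 3D.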

Thanks for watching!

Imaginary Numbers Are Real [Part 12: Riemann's Solution]

In parts 12 and 13, we get to spend some time inside the head of arguably the most important mathematician of the 19th Century - Bernhard Riemann. We're going to begin our next episode by creating a Riemann Surface from our 2 w-planes. This surface will have the wonderful property of making our colored path continuous! While the full theory of Riemann surfaces is far more complex than we can cover here, the baby Riemann Surface we'll create will be sufficient to elegantly visualize our 4-dimensional multifunction and explain the weird path behavior we saw back in part 11. You can download a PDF version of the w-planes here.

Your very own w-planes to cut out!

What's Next for Welch Labs?

Welch Labs has been inactive for 3 months. This will be changing soon. I’ll be finishing up the Imaginary Numbers Series, and am excited to premiere a new series at ML CONF in New York this April. (Use the code Welch18 for $45 off your ticket!)

My vision for Welch Labs in 2016 is to continue making the highest-quality educational content possible. I’m excited to create new Machine Learning content this year, as well as math content in the vein of my Imaginary Numbers series.

If you’re interested in the Machine Learning content - please read on, I’d love to hear what you think. If you’re more interested in math content, I’d also love to hear what you think! Click here to see what math topics are coming up.

The next Machine Learning series from Welch Labs will cover Decision Trees and Mutual Information. This direction came about for a few reasons:

1. As a Strong Intro to Machine Learning. I love making content that appeals to and is understandable by people at all education levels, from high school to graduate school. Decision trees are perhaps the "original" machine learning tool - they are easy to understand and provide excellent "bang for the buck," making them a great introduction to machine learning.

The Five Tribes of Machine Learning According to Pedro Domingos. Image from The Master Algorithm. 

2. Stealing from The Master Algorithm. If you're interested in ML and haven't read Pedro Domingos' book, The Master Algorithm, I highly recommend it. It's certainly the best "pop" machine learning book I've read. Pedro weaves a wide variety of Machine Learning approaches into a single narrative in a clear and approachable way. I love this stuff - it's the kind of history and context you can't get from reading technical resources. Pedro structures his book by dividing Machine Learning into 5 tribes. While this is clearly a simplification of a complex field, I think it is a very useful one - so useful, in fact, that I'll be stealing it :). I'll be roughly following Pedro's path through the world of Machine Learning for the next few series, starting with Decision Trees. Pedro seemed like a nice enough guy when I met him at ML CONF Atlanta, so I'm hoping he won't be too mad.

3. Decision Trees = The Most Popular ML on the Planet. Although no longer among the newest and hottest algorithms out there, trees are incredibly widely used across many diverse applications. The success of trees is likely the result of many factors, but I believe the most important is the simplicity of the resulting models. In my professional work, trees have beaten out other models again and again, simply because I can clearly explain the resulting algorithm to anyone. Try that with a Neural Network!
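To make that explainability point concrete, here's a small illustration of my own (not from any particular Welch Labs project): scikit-learn can print a fitted tree as plain if/else threshold rules that anyone can read.

```python
# A small illustration of why trees are easy to explain: scikit-learn
# renders a fitted decision tree as plain, nested if/else rules.
from sklearn.datasets import load_iris
from sklearn.tree import DecisionTreeClassifier, export_text

iris = load_iris()
tree = DecisionTreeClassifier(max_depth=2, random_state=0)
tree.fit(iris.data, iris.target)

# export_text turns the model into human-readable threshold rules.
rules = export_text(tree, feature_names=iris.feature_names)
print(rules)
```

The printed rules are the entire model - there are no hidden weights to interpret, which is exactly the "bang for the buck" described above.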

Research Time

The next couple of weeks are all about research. I'm reading through Ross Quinlan's book as well as CART. I'm working to develop a deep understanding of the history and fundamentals, and I'll build up from there. Along the way I'm looking for interesting resources on, and applications of, decision trees. I would like to cover a simple example in detail to emphasize the tools and techniques, but I'd also like to spend some time exploring how decision trees are used in big, challenging problems. Mutual information and entropy are big parts of most decision tree algorithms and are clearly topics that could make up their own series - to keep my scope in check, I'll be investigating them through the lens of decision trees.
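For a taste of how entropy and information gain drive a tree's split choices, here's a minimal sketch (the toy labels are made up for illustration, in the spirit of the classic "play tennis" example):

```python
# Minimal sketch of the entropy / information-gain computation at the
# heart of ID3/C4.5-style decision trees (toy data for illustration).
from collections import Counter
from math import log2

def entropy(labels):
    """Shannon entropy H(Y) in bits of a list of class labels."""
    n = len(labels)
    return -sum((c / n) * log2(c / n) for c in Counter(labels).values())

def information_gain(labels, groups):
    """H(Y) minus the size-weighted entropy of the split's subgroups."""
    n = len(labels)
    return entropy(labels) - sum(len(g) / n * entropy(g) for g in groups)

# Toy labels (9 yes / 5 no), split into two hypothetical subgroups.
labels = ["yes"] * 9 + ["no"] * 5
group_a = ["yes", "yes", "no", "no", "no"]
group_b = ["yes"] * 7 + ["no"] * 2

print(round(information_gain(labels, [group_a, group_b]), 3))  # 0.102
```

A tree-growing algorithm evaluates this gain for every candidate split and greedily picks the largest - which is exactly where mutual information enters the picture: the gain of a split is the mutual information between the split variable and the label.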

Help!

As I begin to shape the series, I'd love to hear from you. What do you think is interesting here? What are some cool applications of decision trees? What resources have you found helpful? What would you like to know about decision trees and mutual information? Please let me know in the comment section below. Thank you!