code examples in Python / Theano / Blocks / Foxhound to go along with my blog post "Neural Image Captioning for Mortals"

youralien/image-captioning-for-mortals

image-captioning-for-mortals

From Part 2 of my Medium post titled "Neural Image Captioning for Mortals"...

We’re talking about how mortals (i.e. the author, an undergraduate intern) can implement machine learning models that solve frontier AI problems, like automatically generating captions to describe scenes in images.

These are the mini-projects into which I divided the problem:

  1. Rating how relevant an image and caption are to each other (Part 1)
  2. Given an image of a handwritten digit, generating the word to describe it character-by-character (i.e. “z-e-r-o”) (Part 2)
  3. Given a natural scene photo, generating the sentence to describe it word-by-word (Part 2)
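To make the first mini-project concrete, here is a minimal sketch of what "rating how relevant an image and caption are" can look like: project image features and caption features into a shared space and score the pair by cosine similarity. This is an illustrative sketch, not the repo's actual code; the dimensions (4096-d image features, 300-d caption features, 128-d shared space) and the random, untrained projection matrices are assumptions for demonstration only.

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical dimensions: 4096-d CNN image features and 300-d caption
# embeddings, both projected into a shared 128-d space. In a real model
# these matrices would be learned; here they are random for illustration.
W_img = rng.normal(size=(4096, 128)) * 0.01
W_cap = rng.normal(size=(300, 128)) * 0.01

def relevance(image_feat, caption_feat):
    """Cosine similarity between the projected image and caption vectors."""
    a = image_feat @ W_img
    b = caption_feat @ W_cap
    return float(a @ b / (np.linalg.norm(a) * np.linalg.norm(b)))

# Dummy inputs standing in for real CNN features and caption embeddings.
img = rng.normal(size=4096)
cap = rng.normal(size=300)
score = relevance(img, cap)  # a value in [-1, 1]
```

Training such a model would push the score for matched image/caption pairs above the score for mismatched ones, e.g. with a ranking loss.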

The most important piece I’m trying to get across is that I would not have had the knowledge, tools, or resources to even begin building these models without the help of brilliant, generous people sharing their ideas, code, and models. So, with that perspective in mind, let’s get to it!
