Seam Carving | Week 2, lecture 7 | 18.S191 MIT Fall 2020

An algorithm for intelligently resizing images and an introduction to dynamic programming.

Here are the notebooks used, which were originally written by Shashi Gowda:

Contents
00:00 Welcome!
00:10 What is Seam Carving?
03:37 Finding edges with the gradient
12:32 Optimal paths with dynamic programming
24:06 Other images and uses

Comments

This is the best thing that has happened during the pandemic. I teach seam carving as a PhD student, but now I feel useless :D

SinnohStarly

Combine a few simple algorithms - in this case edge detection and minimum energy path search - and what you get is pure black magic.
That is why I love computer science!

ProjectPhysX

I have my own classes, but I'm watching this series as well, just for Grant.

NEMountainG

Grant's lectures are always pure gold. We need to keep up the activity in the comments so the YouTube algorithm blesses this video. It will help spread knowledge and will definitely make the world a better place.

AV-brbm

How can he be so good at telling the story of such complicated, multilayered concepts and techniques? A genius.

barrycavin

I love the dynamic programming example. Changing from exponential order to a simple multiplication just by reversing the way we do the computation. That is absolutely awesome.

zinxys
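
A minimal sketch of the reversal described in that comment, in hypothetical Python rather than the lecture's Julia notebooks: instead of enumerating every path from the top (exponential in the number of rows), fill in a table of the least energy from each cell to the bottom, one row at a time.

    import numpy as np

    def least_energy_table(energy):
        # E[i, j] = least total energy of any path from cell (i, j)
        # down to the bottom row; filled from the bottom row upward.
        rows, cols = energy.shape
        E = energy.astype(float).copy()
        for i in range(rows - 2, -1, -1):
            for j in range(cols):
                lo, hi = max(0, j - 1), min(cols, j + 2)
                E[i, j] += E[i + 1, lo:hi].min()  # best of the <= 3 cells below
        return E

Each cell is touched once, so the cost is on the order of rows times cols rather than 3^rows.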

This is the best explanation of the magic of content-aware shrinking. Thank you!

oghry

People waste time on Netflix when you have Grant around. Crazy.

kindoblue

If you can, please do more of these computer science / applied math videos on the main channel. They are really interesting, and way easier to remember thanks to your way of explaining. Also, seeing your face while explaining, like you do here, is better than just having the graphics full screen.

nonhonome

Great presentation, Grant! So clear and concise, despite the depth. Makes it look easy and natural. Really good!

SergioRMuniz

"It picks up on the edginess of Mario" --- Grant Sanderson 2020

ancbi

This is just amazing! Grant not only knows a lot but explains it in a way that makes it very simple.

VicenteSchmitt

Wow, I've never seen anyone explain dynamic programming more simply!

MrNightLifeLover

That's why I love algorithms: they are made by the most genuine people.

ManthaarJanyaro

This must be one of Grant's lowest view counts, but thanks a lot for putting out such content and making it accessible.

adityachk

Isn't the idea of choosing one seam at every step itself a greedy algorithm? Just based on my intuition, it should be possible to carve a seam that makes future seams significantly less optimal, perhaps by cutting through half of a seam that could have been a future option.

Of course, the greedy approach seems to work well enough, but perhaps there's a DP approach that could choose better seams by looking ahead at potential future seams.

abhchow
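
For concreteness, the step being called greedy in that question is the outer loop: find the single cheapest seam for the current image, remove it, recompute, repeat. A hypothetical Python sketch, where energy_map and find_seam stand in for the lecture's functions:

    import numpy as np

    def shrink_by(image, k, energy_map, find_seam):
        # Remove k vertical seams from an H x W x 3 image, taking the
        # single cheapest seam each time -- optimal per step, with no
        # lookahead across the k steps.
        img = image
        for _ in range(k):
            seam = find_seam(energy_map(img))   # one column index per row
            mask = np.ones(img.shape[:2], dtype=bool)
            mask[np.arange(img.shape[0]), seam] = False
            img = img[mask].reshape(img.shape[0], img.shape[1] - 1, 3)
        return img

Each individual seam is globally optimal for its image; whether lookahead across seams would ever pay off in practice is the open question raised above.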

After the first pass of computing the least energy to the bottom for each cell, you can get the minimum-energy path by just following the lowest-energy cells from top to bottom. That is faster than starting from each top cell and performing the least-energy path computation again.

johnhausmann
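
A sketch of that traceback in hypothetical Python, assuming E is the least-energy table from the earlier sketch (E[i, j] = least energy from (i, j) down to the bottom row): start at the cheapest cell of the top row, then repeatedly step to the cheapest of the at most three cells below.

    def trace_seam(E):
        # Greedy walk over the DP table; because E already encodes the
        # cost-to-bottom, this greedy walk recovers an optimal path.
        rows, cols = E.shape
        seam = [int(E[0].argmin())]
        for i in range(1, rows):
            j = seam[-1]
            lo, hi = max(0, j - 1), min(cols, j + 2)
            seam.append(lo + int(E[i, lo:hi].argmin()))
        return seam  # one column index per row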

This is awesome, thanks. Especially the bit at the end - great potential for mischief!

maxmetpt

Nice video. Here's an alternative justification of the three rows in the Sobel filter. 1) Taking a derivative requires at least two points per output, so the convolution output is one smaller in that dimension. 2) If you want a gradient matrix, each slice (partial derivative) should be the same size, so you need to average neighboring pixels in every other dimension. 3) If you average the derivatives to reduce noise, you should also average the already-averaged dimension for the same reason. This gives the vertical weights [1; 2; 1].

christophercrawford
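
That averaging argument amounts to saying the Sobel kernel is separable: a central-difference derivative in one direction, outer-multiplied with [1, 2, 1] smoothing in the other. A quick check in Python (my illustration, not from the lecture):

    import numpy as np

    smooth = np.array([1, 2, 1])    # weighted average across the derivative
    deriv = np.array([1, 0, -1])    # central difference along x
    sobel_x = np.outer(smooth, deriv)
    # sobel_x == [[ 1, 0, -1],
    #             [ 2, 0, -2],
    #             [ 1, 0, -1]]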

At 22:48, why are we adding j==1 to dir when saving the min item indices?

piyushsingh
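
Without the notebook code in front of us, a likely answer (this is an assumption about the lecture's Julia code, not a quote of it): the slice of cells below (i, j) is clamped at the left border, so for j == 1 it starts at column j rather than j - 1, and the 1-based index that argmin returns into that slice is then off by one; adding (j == 1) re-aligns it. The 0-based Python analogue shifts by the clamped slice start instead:

    import numpy as np

    def argmin_below(E, i, j):
        # Column index (0-based) of the cheapest of the <= 3 cells
        # below (i, j). lo is the clamped slice start; adding it back
        # plays the same role as the (j == 1) term in 1-based Julia.
        lo, hi = max(0, j - 1), min(E.shape[1], j + 2)
        return lo + int(np.argmin(E[i + 1, lo:hi]))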