- Course Structure
- (20 lectures) Total: 13 hours 1 min | Average: 39 min 2 sec
- Access period / Instructor
- 2 months / Alzio R&D [IT]
- Proof documents available
- Certificate of completion, certificate of attendance, learning progress
- Tuition
- 450,000 KRW

- 225,000 KRW
Frequently Asked Questions Before Enrolling
Can I take the course without a textbook? Is it suitable for beginners?
Yes. You can study with the videos and practice materials alone, and the course is designed so that beginners can understand and follow along.
What makes Alzio lectures different?
Alzio lectures are not simple recordings; they are professionally edited to cover only the essentials and make your study time more efficient.
How do I obtain the software used in the course?
Alzio is a remote lifelong-education institution and cannot provide information about the software itself.
Thank you to all our students. A portion of your tuition goes toward monthly community-giving activities; in particular, we donate instant noodles to neighbors in need, continuing a small tradition of sharing. Details are available at the link below. Alzio Community Giving Activities
-
01. 34 min
Data Preparation
Covers Data Load, Data Summarization, Data Visualization, and Data Split. (Python libraries: os, tarfile, urllib, numpy, pandas, matplotlib.pyplot, etc.)
What this machine learning course covers / Installing the required programs and libraries / Data Preparation practice - downloading and extracting files / Data Load / Data Summarization / Data Visualization / Data Split / Data Split - separating the train and test sets
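A minimal sketch of this lecture's data-preparation flow in Python, assuming a hypothetical housing.tgz archive (the placeholder DATA_URL and the housing.csv layout are illustrative, not taken from the lecture):

    import os
    import tarfile
    import urllib.request
    import pandas as pd
    import matplotlib.pyplot as plt
    from sklearn.model_selection import train_test_split

    DATA_URL = "https://example.com/datasets/housing.tgz"  # placeholder; point at a real archive
    DATA_DIR = "datasets/housing"

    def fetch_data(url=DATA_URL, path=DATA_DIR):
        os.makedirs(path, exist_ok=True)
        tgz_path = os.path.join(path, "housing.tgz")
        urllib.request.urlretrieve(url, tgz_path)          # file download
        with tarfile.open(tgz_path) as tgz:
            tgz.extractall(path=path)                      # extract the CSV

    fetch_data()
    housing = pd.read_csv(os.path.join(DATA_DIR, "housing.csv"))   # Data Load
    housing.info()                                                 # Data Summarization
    housing.hist(bins=50, figsize=(12, 8))                         # Data Visualization
    plt.show()
    train_set, test_set = train_test_split(housing, test_size=0.2, random_state=42)  # Data Split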
-
02. 46 min
Data Understanding
Covers (Expansion) Data Split, (Comparison) Data Distribution, Data Visualization with scatter plots, and Correlation Analysis.
Intro / Housing analysis / Using train_test_split / Using StratifiedShuffleSplit / Error notes (on income_cat) / Error notes (on income_cat) 2 / Comparison (comparing the data) / Visualizing the data set / bad_visualization / Correlation analysis / Correlation_matrix
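A sketch of the stratified split and correlation steps, assuming the housing DataFrame from the previous sketch and the usual median_income / median_house_value / longitude / latitude column names (assumptions, not confirmed by the lecture):

    import numpy as np
    import pandas as pd
    from sklearn.model_selection import StratifiedShuffleSplit

    # Bucket income into categories so the split preserves the income distribution.
    housing["income_cat"] = pd.cut(housing["median_income"],
                                   bins=[0.0, 1.5, 3.0, 4.5, 6.0, np.inf],
                                   labels=[1, 2, 3, 4, 5])

    split = StratifiedShuffleSplit(n_splits=1, test_size=0.2, random_state=42)
    for train_idx, test_idx in split.split(housing, housing["income_cat"]):
        strat_train_set = housing.iloc[train_idx]
        strat_test_set = housing.iloc[test_idx]

    housing.plot(kind="scatter", x="longitude", y="latitude", alpha=0.1)  # scatter visualization
    corr_matrix = housing.drop(columns=["income_cat"]).corr(numeric_only=True)
    print(corr_matrix["median_house_value"].sort_values(ascending=False))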
-
03. 42 min
Data Preprocessing
Covers Data Cleansing, Categorical Data, and Pipeline processing.
Loading the data / Using an imputer / Validating the imputation / Handling labels / Using one-hot encoding / Pipeline / Creating an attribute adder / Exploring StandardScaler
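A self-contained sketch of the preprocessing steps named above (imputation, one-hot encoding, a numeric Pipeline with StandardScaler), using a tiny made-up DataFrame rather than the lecture's data:

    import numpy as np
    import pandas as pd
    from sklearn.impute import SimpleImputer
    from sklearn.preprocessing import OneHotEncoder, StandardScaler
    from sklearn.pipeline import Pipeline

    # Tiny illustrative frame with a missing value and a categorical column.
    df = pd.DataFrame({"rooms": [3, np.nan, 5, 4],
                       "age":   [20, 35, 15, 40],
                       "ocean_proximity": ["INLAND", "NEAR BAY", "INLAND", "NEAR OCEAN"]})

    cat_1hot = OneHotEncoder().fit_transform(df[["ocean_proximity"]])  # one-hot encoding

    num_pipeline = Pipeline([                          # numeric pipeline: impute, then scale
        ("imputer", SimpleImputer(strategy="median")),
        ("scaler", StandardScaler()),
    ])
    num_prepared = num_pipeline.fit_transform(df[["rooms", "age"]])
    print(num_prepared)
    print(cat_1hot.toarray())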
-
04. 52 min
ML model (end-to-end)
Practices Model Selection, Model Training, and Model Tuning end-to-end to build an overall understanding of machine learning.
Loading the data / Splitting the train and test sets / Data cleaning [Data Preparation, Data Processing] / Data cleaning 2 [Data Preparation, Data Processing] / Machine Learning - Linear Regression / Machine Learning - Mean Squared Error, Mean Absolute Error / Machine Learning - Decision Tree / Machine Learning - CV / Machine Learning - CV 2 / Machine Learning - Random Forest / Machine Learning - SVM
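A minimal end-to-end sketch of the model-selection loop with the models listed above, using scikit-learn's bundled diabetes data as a stand-in for the lecture's dataset:

    import numpy as np
    from sklearn.datasets import load_diabetes
    from sklearn.model_selection import train_test_split, cross_val_score
    from sklearn.linear_model import LinearRegression
    from sklearn.tree import DecisionTreeRegressor
    from sklearn.ensemble import RandomForestRegressor
    from sklearn.svm import SVR
    from sklearn.metrics import mean_squared_error, mean_absolute_error

    X, y = load_diabetes(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

    for model in (LinearRegression(), DecisionTreeRegressor(random_state=42),
                  RandomForestRegressor(random_state=42), SVR()):
        scores = cross_val_score(model, X_train, y_train,          # cross-validation (CV)
                                 scoring="neg_mean_squared_error", cv=5)
        print(type(model).__name__, "CV RMSE:", np.sqrt(-scores).mean())

    final = RandomForestRegressor(random_state=42).fit(X_train, y_train)
    pred = final.predict(X_test)
    print("MSE:", mean_squared_error(y_test, pred), "MAE:", mean_absolute_error(y_test, pred))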
-
05. 55 min
Classification Part 1
Covers classification among machine learning algorithms. (Model Training: SGD Classifier; Model Evaluation: Cross Validation, Precision & Recall & F1-score, ROC Curve)
MNIST data / Checking the MNIST data / Plotting the MNIST data / Plotting the MNIST data 2 / Predicting with the classifier (clf) / Cross Validation / Evaluating with a Confusion Matrix / Precision & Recall trade-off / Building the Precision & Recall curve / Plotting Precision against Recall / Drawing the ROC Curve
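A sketch of the binary-classification evaluation flow described above, using the bundled digits dataset as a lightweight stand-in for MNIST:

    from sklearn.datasets import load_digits
    from sklearn.linear_model import SGDClassifier
    from sklearn.model_selection import cross_val_score, cross_val_predict
    from sklearn.metrics import (confusion_matrix, precision_score, recall_score,
                                 f1_score, precision_recall_curve, roc_curve)

    X, y = load_digits(return_X_y=True)
    y_is_5 = (y == 5)                                  # binary target: "is this digit a 5?"

    clf = SGDClassifier(random_state=42)
    print(cross_val_score(clf, X, y_is_5, cv=3, scoring="accuracy"))   # cross validation

    y_pred = cross_val_predict(clf, X, y_is_5, cv=3)
    print(confusion_matrix(y_is_5, y_pred))
    print(precision_score(y_is_5, y_pred), recall_score(y_is_5, y_pred), f1_score(y_is_5, y_pred))

    scores = cross_val_predict(clf, X, y_is_5, cv=3, method="decision_function")
    precisions, recalls, pr_thresholds = precision_recall_curve(y_is_5, scores)  # P/R curve points
    fpr, tpr, roc_thresholds = roc_curve(y_is_5, scores)                         # ROC curve points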
-
06. 33 min
Classification Part 2
Covers classification among machine learning algorithms. (Model Building: SGDClassifier, OneVsOneClassifier, RandomForestClassifier; Error Visualization: Confusion Matrix with matshow; Multilabel Classification)
Loading the MNIST data / Creating the y labels / Investigating the labels / Building a Random Forest Classifier / Checking the Confusion Matrix / Visualizing the Confusion Matrix
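A sketch of multiclass classification with a Random Forest plus confusion-matrix visualization via matshow, and a small multilabel example with a k-nearest-neighbors classifier (the digits stand-in and the two made-up labels are assumptions):

    import numpy as np
    import matplotlib.pyplot as plt
    from sklearn.datasets import load_digits
    from sklearn.ensemble import RandomForestClassifier
    from sklearn.model_selection import cross_val_predict
    from sklearn.metrics import confusion_matrix
    from sklearn.neighbors import KNeighborsClassifier

    X, y = load_digits(return_X_y=True)
    forest = RandomForestClassifier(random_state=42)
    y_pred = cross_val_predict(forest, X, y, cv=3)

    conf_mx = confusion_matrix(y, y_pred)
    plt.matshow(conf_mx, cmap=plt.cm.gray)             # error visualization with matshow
    plt.show()

    y_multilabel = np.c_[(y >= 7), (y % 2 == 1)]       # two labels per image: "large digit", "odd"
    knn = KNeighborsClassifier().fit(X, y_multilabel)  # multilabel classification
    print(knn.predict(X[:1]))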
-
07. 34 min
Regression Part 1
Covers Linear Regression. (Normal Equation, Batch Gradient Descent, Stochastic Gradient Descent, Mini-batch Gradient Descent)
Normal Equation / Plotting the Linear Regression / Batch Gradient Descent / Stochastic Gradient Descent / Mini-batch Gradient Descent
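A worked sketch of the normal equation versus batch gradient descent on synthetic linear data; the learning rate and iteration count are illustrative choices:

    import numpy as np

    np.random.seed(42)
    X = 2 * np.random.rand(100, 1)
    y = 4 + 3 * X + np.random.randn(100, 1)
    X_b = np.c_[np.ones((100, 1)), X]                  # add the bias column

    theta_normal = np.linalg.inv(X_b.T @ X_b) @ X_b.T @ y   # normal equation

    eta, n_iterations = 0.1, 1000                      # batch gradient descent
    theta = np.random.randn(2, 1)
    for _ in range(n_iterations):
        gradients = 2 / len(X_b) * X_b.T @ (X_b @ theta - y)
        theta -= eta * gradients

    print(theta_normal.ravel(), theta.ravel())         # both approach [4, 3]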
-
08. 34 min
Regression Part 2
Covers other regression methods. (Polynomial Regression, Logistic Regression)
Polynomial Regression / Polynomial Regression 2 / Polynomial Regression 3 / Comparing models / Logistic Regression / Logistic Regression 2 / Logistic Regression 3
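A sketch of Polynomial Regression via PolynomialFeatures and of Logistic Regression on the bundled iris data (the synthetic quadratic data and the petal-width feature choice are assumptions):

    import numpy as np
    from sklearn.preprocessing import PolynomialFeatures
    from sklearn.linear_model import LinearRegression, LogisticRegression
    from sklearn.datasets import load_iris

    np.random.seed(42)
    X = 6 * np.random.rand(100, 1) - 3
    y = 0.5 * X**2 + X + 2 + np.random.randn(100, 1)

    X_poly = PolynomialFeatures(degree=2, include_bias=False).fit_transform(X)
    poly_reg = LinearRegression().fit(X_poly, y)       # quadratic fit via a linear model
    print(poly_reg.intercept_, poly_reg.coef_)

    iris = load_iris()
    X_iris = iris.data[:, 3:]                          # petal width only
    y_virginica = (iris.target == 2).astype(int)
    log_reg = LogisticRegression().fit(X_iris, y_virginica)
    print(log_reg.predict_proba([[1.7]]))              # probability of Iris virginica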
-
09. 37 min
SVM Part 1
Trains SVM models. (Comparing good and bad models, Large margin classification, Sensitivity to feature scales and outliers)
Building an SVM model / Good and bad Linear SVM models / Good and bad Linear SVM models 2 / Large margin classification / Large margin classification 2 / Large margin classification 3 / Large margin classification 4
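A sketch of a linear large-margin classifier on the iris data; features are standardized first because SVM margins are sensitive to feature scale:

    from sklearn.datasets import load_iris
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import LinearSVC

    iris = load_iris()
    X = iris.data[:, 2:4]                              # petal length, petal width
    y = (iris.target == 2).astype(int)

    svm_clf = Pipeline([
        ("scaler", StandardScaler()),                  # scale features before fitting the SVM
        ("linear_svc", LinearSVC(C=1, loss="hinge")),
    ])
    svm_clf.fit(X, y)
    print(svm_clf.predict([[5.5, 1.7]]))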
-
10. 58 min
SVM Part 2
Trains SVM models on nonlinear datasets. (Polynomial Kernel, Similarity Features, Gaussian RBF Kernel)
What does non-linear mean? / Creating a nonlinear dataset / Creating a nonlinear dataset 2 / Polynomial Kernel - polynomial features / Similarity features / Using the kernel trick / Gaussian RBF Kernel / Gaussian RBF Kernel - plotting / Gaussian RBF Kernel - plotting 2 / Building the SVM model
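A sketch of nonlinear SVM classification on a make_moons dataset with the polynomial and Gaussian RBF kernels; the hyperparameter values are illustrative:

    from sklearn.datasets import make_moons
    from sklearn.pipeline import Pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.svm import SVC

    X, y = make_moons(n_samples=200, noise=0.15, random_state=42)   # nonlinear dataset

    poly_kernel_svm = Pipeline([
        ("scaler", StandardScaler()),
        ("svm", SVC(kernel="poly", degree=3, coef0=1, C=5)),        # polynomial kernel trick
    ]).fit(X, y)

    rbf_kernel_svm = Pipeline([
        ("scaler", StandardScaler()),
        ("svm", SVC(kernel="rbf", gamma=5, C=0.001)),               # Gaussian RBF kernel
    ]).fit(X, y)

    print(poly_kernel_svm.score(X, y), rbf_kernel_svm.score(X, y))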
-
11. 30 min
SVM Part 3
Trains SVM regression (SVM Regression) models. (Linear SVR, Polynomial Kernel Trick)
Imports and data generation / Linear SVR / Linear SVR 2 / Linear SVR 3 / Polynomial Kernel Trick / Polynomial Kernel Trick 2
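A sketch of SVM regression: a LinearSVR on noisy linear data and an SVR with a polynomial kernel on quadratic data; the epsilon and C values are illustrative:

    import numpy as np
    from sklearn.svm import LinearSVR, SVR

    np.random.seed(42)
    X = 2 * np.random.rand(100, 1)
    y = (4 + 3 * X + np.random.randn(100, 1)).ravel()

    lin_svr = LinearSVR(epsilon=1.5, random_state=42).fit(X, y)     # linear SVM regression

    X2 = 2 * np.random.rand(100, 1) - 1
    y2 = (0.2 + 0.1 * X2 + 0.5 * X2**2 + np.random.randn(100, 1) / 10).ravel()
    poly_svr = SVR(kernel="poly", degree=2, C=100, epsilon=0.1).fit(X2, y2)  # polynomial kernel

    print(lin_svr.predict([[1.0]]), poly_svr.predict([[0.5]]))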
-
12. 49 min
SVM Part 4
Covers the theoretical background of SVMs. (Decision Function, Objective Function, Hinge Loss, Training Time)
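A worked sketch of the decision function w.x + b and the hinge loss max(0, 1 - t(w.x + b)) that the linear SVM objective is built on, computed from a fitted LinearSVC; the iris setup is illustrative, not the lecture's example:

    import numpy as np
    from sklearn.datasets import load_iris
    from sklearn.svm import LinearSVC

    iris = load_iris()
    X = iris.data[:, 2:4]
    t = np.where(iris.target == 2, 1, -1)              # labels in {-1, +1}

    svm = LinearSVC(C=1, loss="hinge", max_iter=10000).fit(X, t)
    w, b = svm.coef_[0], svm.intercept_[0]

    decision = X @ w + b                               # decision function w.x + b
    hinge = np.maximum(0, 1 - t * decision)            # per-sample hinge loss
    print(hinge.mean())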
-
13. 54 min
DecisionTree Part 1
Covers Decision Trees. (Training, Visualization, Class Prediction, Sensitivity, Restriction)
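A sketch of training, visualizing, and restricting a Decision Tree classifier on the iris data, including class prediction and class probabilities:

    from sklearn.datasets import load_iris
    from sklearn.tree import DecisionTreeClassifier, export_text

    iris = load_iris()
    X, y = iris.data[:, 2:4], iris.target

    tree_clf = DecisionTreeClassifier(max_depth=2, random_state=42)  # depth restriction
    tree_clf.fit(X, y)

    print(export_text(tree_clf, feature_names=iris.feature_names[2:4]))  # text visualization
    print(tree_clf.predict([[5.0, 1.5]]))              # predicted class
    print(tree_clf.predict_proba([[5.0, 1.5]]))        # class probabilities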
-
14. 26 min
DecisionTree Part 2
Covers regression with Decision Trees. (DecisionTreeRegressor, Comparison (Decision Tree: Classification vs. Regression), Restriction)
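A sketch of a regression Decision Tree on noisy quadratic data, fit with and without a depth restriction to contrast a regularized tree against an overfitting one:

    import numpy as np
    from sklearn.tree import DecisionTreeRegressor

    np.random.seed(42)
    X = np.random.rand(200, 1) - 0.5
    y = X.ravel() ** 2 + 0.05 * np.random.randn(200)

    shallow = DecisionTreeRegressor(max_depth=2, random_state=42).fit(X, y)  # restricted depth
    deep = DecisionTreeRegressor(random_state=42).fit(X, y)                  # unrestricted: overfits

    print(shallow.predict([[0.25]]), deep.predict([[0.25]]))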
-
15. 34 min
EnsembleLearning Part 1
Covers the Voting and Bagging methods of Ensemble Learning. (Voting Classifier, Bagging Classifier)
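A sketch of a hard-voting ensemble and a bagging ensemble of decision trees on a make_moons dataset; the base estimators and sample counts are illustrative:

    from sklearn.datasets import make_moons
    from sklearn.model_selection import train_test_split
    from sklearn.ensemble import VotingClassifier, BaggingClassifier
    from sklearn.linear_model import LogisticRegression
    from sklearn.svm import SVC
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
    X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=42)

    voting = VotingClassifier(estimators=[
        ("lr", LogisticRegression()), ("svc", SVC()), ("dt", DecisionTreeClassifier())],
        voting="hard").fit(X_train, y_train)                        # hard-voting ensemble

    bagging = BaggingClassifier(DecisionTreeClassifier(), n_estimators=200,
                                max_samples=100, random_state=42).fit(X_train, y_train)

    print(voting.score(X_test, y_test), bagging.score(X_test, y_test))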
-
16. 28 min
EnsembleLearning Part 2
Covers Random Forest, the representative Bagging algorithm. (Random Forest, Out-of-bag Evaluation, Feature Importance)
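A sketch of a Random Forest with out-of-bag evaluation and feature importances, on the bundled iris data:

    from sklearn.datasets import load_iris
    from sklearn.ensemble import RandomForestClassifier

    iris = load_iris()
    forest = RandomForestClassifier(n_estimators=200, oob_score=True, random_state=42)
    forest.fit(iris.data, iris.target)

    print("OOB score:", forest.oob_score_)             # out-of-bag evaluation
    for name, importance in zip(iris.feature_names, forest.feature_importances_):
        print(name, round(importance, 3))              # feature importance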
-
17. 59 min
EnsembleLearning Part 3
Covers Boosting methods. (AdaBoost, Gradient Boosting)
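A sketch of the two boosting methods named above: AdaBoost with decision stumps for classification and Gradient Boosting for regression; the hyperparameters are illustrative:

    import numpy as np
    from sklearn.datasets import make_moons
    from sklearn.ensemble import AdaBoostClassifier, GradientBoostingRegressor
    from sklearn.tree import DecisionTreeClassifier

    X, y = make_moons(n_samples=500, noise=0.3, random_state=42)
    ada = AdaBoostClassifier(DecisionTreeClassifier(max_depth=1),   # boost decision stumps
                             n_estimators=200, learning_rate=0.5, random_state=42)
    ada.fit(X, y)

    np.random.seed(42)
    Xr = np.random.rand(200, 1) - 0.5
    yr = 3 * Xr.ravel() ** 2 + 0.05 * np.random.randn(200)
    gbrt = GradientBoostingRegressor(max_depth=2, n_estimators=100,
                                     learning_rate=0.1, random_state=42).fit(Xr, yr)

    print(ada.score(X, y), gbrt.predict([[0.2]]))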
-
18. 27 min
Dimension Reduction Part 1
Covers dimensionality reduction methods. (Projection with PCA, Manifold Learning)
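A minimal sketch of projecting data onto its top principal components with PCA, on the bundled iris data (4 features down to 2):

    from sklearn.datasets import load_iris
    from sklearn.decomposition import PCA

    X = load_iris().data
    pca = PCA(n_components=2)
    X2D = pca.fit_transform(X)                         # project onto the top 2 components
    print(pca.explained_variance_ratio_)               # variance kept by each component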
-
19. 24 min
Dimension Reduction Part 2
Covers dimensionality reduction methods. (PCA, MNIST compression, Incremental PCA)
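A sketch of PCA compression (keeping 95% of the variance) and IncrementalPCA fitted in mini-batches, on the bundled digits data as an MNIST stand-in:

    import numpy as np
    from sklearn.datasets import load_digits
    from sklearn.decomposition import PCA, IncrementalPCA

    X, _ = load_digits(return_X_y=True)

    pca = PCA(n_components=0.95)                       # keep 95% of the variance
    X_reduced = pca.fit_transform(X)
    X_recovered = pca.inverse_transform(X_reduced)     # decompress back to 64 pixels
    print(X.shape, "->", X_reduced.shape)

    inc_pca = IncrementalPCA(n_components=20)          # fit in mini-batches
    for batch in np.array_split(X, 10):
        inc_pca.partial_fit(batch)
    X_inc = inc_pca.transform(X)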
-
20. 25 min
Dimension Reduction Part 3
Covers dimensionality reduction methods. (Kernel PCA, LLE (Locally Linear Embedding))
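A sketch of Kernel PCA with an RBF kernel and Locally Linear Embedding on a Swiss-roll dataset; the gamma and neighbor counts are illustrative:

    from sklearn.datasets import make_swiss_roll
    from sklearn.decomposition import KernelPCA
    from sklearn.manifold import LocallyLinearEmbedding

    X, _ = make_swiss_roll(n_samples=1000, noise=0.2, random_state=42)

    rbf_pca = KernelPCA(n_components=2, kernel="rbf", gamma=0.04)
    X_kpca = rbf_pca.fit_transform(X)                  # nonlinear projection via the kernel trick

    lle = LocallyLinearEmbedding(n_components=2, n_neighbors=10, random_state=42)
    X_lle = lle.fit_transform(X)                       # unroll the manifold with LLE
    print(X_kpca.shape, X_lle.shape)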