Thread starter: oliyiyi
2581 views · 8 replies

KNN (example code in Python and R)

#1 oliyiyi posted on 2017-9-12 15:57:04


KNN (K-Nearest Neighbors)

It can be used for both classification and regression problems. However, it is more widely used for classification problems in industry. K nearest neighbors is a simple algorithm that stores all available cases and classifies new cases by a majority vote of their k nearest neighbors: the new case is assigned to the class that is most common among its K nearest neighbors, as measured by a distance function.
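
To make the stored-cases-plus-majority-vote idea concrete, here is a minimal from-scratch sketch in Python (a toy illustration, not from the original post; the function and variable names are made up):

import numpy as np
from collections import Counter

def knn_predict(X_train, y_train, x_new, k=3):
    # X_train: 2-D NumPy array of stored cases; y_train: NumPy array of their labels
    # Euclidean distance from the new case to every stored training case
    distances = np.sqrt(np.sum((X_train - x_new) ** 2, axis=1))
    # Indices of the k nearest stored cases
    nearest = np.argsort(distances)[:k]
    # Majority vote among the neighbors' labels
    return Counter(y_train[nearest]).most_common(1)[0][0]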

These distance functions can be Euclidean, Manhattan, Minkowski and Hamming distance. The first three are used for continuous variables and the fourth (Hamming) for categorical variables. If K = 1, then the case is simply assigned to the class of its nearest neighbor. At times, choosing K turns out to be a challenge when performing KNN modeling.
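
For illustration, a minimal NumPy sketch of these four distances on made-up vectors (everything below is illustrative, not from the original post):

import numpy as np

a = np.array([1.0, 2.0, 3.0])
b = np.array([2.0, 4.0, 6.0])

euclidean = np.sqrt(np.sum((a - b) ** 2))          # square root of summed squared differences
manhattan = np.sum(np.abs(a - b))                  # sum of absolute differences
p = 3
minkowski = np.sum(np.abs(a - b) ** p) ** (1 / p)  # generalizes Euclidean (p=2) and Manhattan (p=1)

# Hamming distance for categorical vectors: number of positions that differ
c = np.array(["red", "green", "blue"])
d = np.array(["red", "blue", "blue"])
hamming = np.sum(c != d)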

More: Introduction to k-nearest neighbors: Simplified.

KNN can easily be mapped to our real lives. If you want to learn about a person of whom you have no information, you might find out about their close friends and the circles they move in, and thereby gain access to information about them.

Things to consider before selecting KNN:

  • KNN is computationally expensive at prediction time, since each new case is compared against all stored cases
  • Variables should be normalized, otherwise variables with larger ranges can dominate the distance and bias the result (see the sketch after this list)
  • Invest more effort in the pre-processing stage before running KNN, e.g. outlier and noise removal
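
For example, the predictors can be standardized before fitting KNN. A minimal scikit-learn sketch, assuming X_train and X_test are numeric arrays (the names are illustrative):

from sklearn.preprocessing import StandardScaler

scaler = StandardScaler()
X_train_scaled = scaler.fit_transform(X_train)  # learn mean/std on the training data only
X_test_scaled = scaler.transform(X_test)        # apply the same scaling to the test data
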
Python Code
# Import library
from sklearn.neighbors import KNeighborsClassifier
# Assumed you have X (predictors) and y (target) for the training data set
# and x_test (predictors) for the test data set
# Create a KNeighbors classifier object
model = KNeighborsClassifier(n_neighbors=6)  # default value for n_neighbors is 5
# Train the model using the training set
model.fit(X, y)
# Predict output
predicted = model.predict(x_test)
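
For a concrete end-to-end run of the snippet above, here is a minimal sketch on scikit-learn's built-in iris data (the dataset choice and the accuracy check are illustrative additions, not part of the original post):

from sklearn.datasets import load_iris
from sklearn.model_selection import train_test_split
from sklearn.neighbors import KNeighborsClassifier
from sklearn.metrics import accuracy_score

X, y = load_iris(return_X_y=True)
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.3, random_state=0)

model = KNeighborsClassifier(n_neighbors=6)
model.fit(X_train, y_train)
predicted = model.predict(X_test)
print("test accuracy:", accuracy_score(y_test, predicted))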

R Code

# Import library: the class package provides the knn() function
library(class)
# x_train / x_test are the predictors; y_train holds the training class labels
# class::knn() returns the predicted labels for x_test directly
predicted <- knn(train = x_train, test = x_test, cl = y_train, k = 5)
summary(predicted)


Keywords: python knn introduction information Computation


#2 hyq2003 posted on 2017-9-12 16:07:01

#3 ztzxx (verified student) posted on 2017-9-12 16:08:03
Learning from this.

#4 钱学森64 posted on 2017-9-12 16:45:45
Thanks for sharing.

#5 MouJack007 posted on 2017-9-12 19:29:58
Thanks to the OP for sharing!

#6 MouJack007 posted on 2017-9-12 19:31:08

#7 ekscheng posted on 2017-9-12 23:31:19

#8 minixi posted on 2017-9-13 10:54:59
Thanks for sharing.

#9 piiroja posted on 2020-9-20 11:30:19
thx for sharing~
