Thread starter: Lisrelchen

Keras: Deep Learning library


#1 Lisrelchen posted on 2017-4-26 10:31:20

Keras: Deep Learning library for Theano and TensorFlow

You have just found Keras.

Keras is a high-level neural networks API, written in Python and capable of running on top of either TensorFlow or Theano. It was developed with a focus on enabling fast experimentation. Being able to go from idea to result with the least possible delay is key to doing good research.

Use Keras if you need a deep learning library that:

  • Allows for easy and fast prototyping (through user friendliness, modularity, and extensibility).
  • Supports both convolutional networks and recurrent networks, as well as combinations of the two.
  • Runs seamlessly on CPU and GPU.

Read the documentation at Keras.io.

Keras is compatible with: Python 2.7-3.5.




#2 Lisrelchen posted on 2017-4-26 10:31:45

Getting started with the Keras Sequential model

The Sequential model is a linear stack of layers. You can create a Sequential model by passing a list of layer instances to the constructor:

from keras.models import Sequential
from keras.layers import Dense, Activation

model = Sequential([
    Dense(32, input_shape=(784,)),
    Activation('relu'),
    Dense(10),
    Activation('softmax'),
])

#3 Lisrelchen posted on 2017-4-26 10:32:21

Specifying the input shape

The model needs to know what input shape to expect. For this reason, the first layer in a Sequential model (and only the first, because following layers can do automatic shape inference) needs to receive information about its input shape. There are several ways to do this:

  • Pass an input_shape argument to the first layer. This is a shape tuple (a tuple of integers or None entries, where None indicates that any positive integer may be expected). The batch dimension is not included in input_shape.
  • Some 2D layers, such as Dense, support specifying their input shape via the argument input_dim, and some 3D temporal layers support the arguments input_dim and input_length.
  • If you ever need to specify a fixed batch size for your inputs (useful for stateful recurrent networks), you can pass a batch_size argument to a layer. If you pass both batch_size=32 and input_shape=(6, 8) to a layer, it will expect every batch of inputs to have the batch shape (32, 6, 8).

As such, the following snippets are strictly equivalent:

model = Sequential()
model.add(Dense(32, input_shape=(784,)))

model = Sequential()
model.add(Dense(32, input_dim=784))
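As a sanity check on the batch shape described above, here is a short NumPy sketch (dummy data, not part of the original post) showing that inputs declared with input_shape=(6, 8) and batch_size=32 reach the layer as arrays of shape (32, 6, 8):

```python
import numpy as np

# 96 dummy samples, each a (6, 8) array; the batch dimension is not part of input_shape
data = np.random.random((96, 6, 8))

# Iterating over the data in batches of 32 yields arrays with the full batch shape
batch_size = 32
batches = [data[i:i + batch_size] for i in range(0, len(data), batch_size)]
print(batches[0].shape)  # (32, 6, 8)
```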

#4 Lisrelchen posted on 2017-4-26 10:33:01

Compilation

# For a multi-class classification problem
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# For a binary classification problem
model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy'])

# For a mean squared error regression problem
model.compile(optimizer='rmsprop',
              loss='mse')

# For custom metrics
import keras.backend as K

def mean_pred(y_true, y_pred):
    return K.mean(y_pred)

model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy', mean_pred])
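The custom metric above just averages the predictions over a batch. A NumPy version of the same computation (illustrative only; Keras metrics operate on backend tensors, not plain arrays) makes that concrete:

```python
import numpy as np

def mean_pred(y_true, y_pred):
    # Metrics take (y_true, y_pred); this one only uses the predictions
    return np.mean(y_pred)

y_true = np.array([1, 0, 1, 0])
y_pred = np.array([0.9, 0.1, 0.8, 0.2])
print(mean_pred(y_true, y_pred))  # 0.5
```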

#5 Lisrelchen posted on 2017-4-26 10:33:48

# For a single-input model with 2 classes (binary classification):

from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(32, activation='relu', input_dim=100))
model.add(Dense(1, activation='sigmoid'))
model.compile(optimizer='rmsprop',
              loss='binary_crossentropy',
              metrics=['accuracy'])

# Generate dummy data
import numpy as np
data = np.random.random((1000, 100))
labels = np.random.randint(2, size=(1000, 1))

# Train the model, iterating on the data in batches of 32 samples
model.fit(data, labels, epochs=10, batch_size=32)

#6 Lisrelchen posted on 2017-4-26 10:34:23

# For a single-input model with 10 classes (categorical classification):

import keras
from keras.models import Sequential
from keras.layers import Dense

model = Sequential()
model.add(Dense(32, activation='relu', input_dim=100))
model.add(Dense(10, activation='softmax'))
model.compile(optimizer='rmsprop',
              loss='categorical_crossentropy',
              metrics=['accuracy'])

# Generate dummy data
import numpy as np
data = np.random.random((1000, 100))
labels = np.random.randint(10, size=(1000, 1))

# Convert labels to categorical one-hot encoding
one_hot_labels = keras.utils.to_categorical(labels, num_classes=10)

# Train the model, iterating on the data in batches of 32 samples
model.fit(data, one_hot_labels, epochs=10, batch_size=32)
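keras.utils.to_categorical turns each integer label into a one-hot row. A minimal NumPy equivalent (a sketch, not the library implementation) behaves the same way:

```python
import numpy as np

def to_one_hot(labels, num_classes):
    # Each integer label selects the matching row of an identity matrix
    return np.eye(num_classes)[labels.reshape(-1)]

labels = np.array([[1], [0], [3]])
print(to_one_hot(labels, num_classes=4))
# [[0. 1. 0. 0.]
#  [1. 0. 0. 0.]
#  [0. 0. 0. 1.]]
```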

#7 Lisrelchen posted on 2017-4-26 10:35:11

Multilayer Perceptron (MLP) for multi-class softmax classification:

import keras
from keras.models import Sequential
from keras.layers import Dense, Dropout, Activation
from keras.optimizers import SGD

# Generate dummy data
import numpy as np
x_train = np.random.random((1000, 20))
y_train = keras.utils.to_categorical(np.random.randint(10, size=(1000, 1)), num_classes=10)
x_test = np.random.random((100, 20))
y_test = keras.utils.to_categorical(np.random.randint(10, size=(100, 1)), num_classes=10)

model = Sequential()
# Dense(64) is a fully-connected layer with 64 hidden units.
# In the first layer, you must specify the expected input data shape:
# here, 20-dimensional vectors.
model.add(Dense(64, activation='relu', input_dim=20))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))

sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy',
              optimizer=sgd,
              metrics=['accuracy'])

model.fit(x_train, y_train,
          epochs=20,
          batch_size=128)
score = model.evaluate(x_test, y_test, batch_size=128)
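The decay argument given to SGD above shrinks the learning rate over training. Keras of this era applies time-based decay, roughly lr_t = lr / (1 + decay * iterations); a small sketch of that schedule (the formula is stated here as an assumption about the implementation, not quoted from it):

```python
def decayed_lr(lr, decay, iterations):
    # Time-based decay: the effective learning rate after a given number of updates
    return lr * (1.0 / (1.0 + decay * iterations))

lr, decay = 0.01, 1e-6
print(decayed_lr(lr, decay, 0))        # 0.01 (unchanged at the start)
print(decayed_lr(lr, decay, 1000000))  # roughly halved after a million updates
```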

#8 Lisrelchen posted on 2017-4-26 10:35:29

MLP for binary classification:

import numpy as np
from keras.models import Sequential
from keras.layers import Dense, Dropout

# Generate dummy data
x_train = np.random.random((1000, 20))
y_train = np.random.randint(2, size=(1000, 1))
x_test = np.random.random((100, 20))
y_test = np.random.randint(2, size=(100, 1))

model = Sequential()
model.add(Dense(64, input_dim=20, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(64, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

model.fit(x_train, y_train,
          epochs=20,
          batch_size=128)
score = model.evaluate(x_test, y_test, batch_size=128)

#9 Lisrelchen posted on 2017-4-26 10:36:14

VGG-like convnet:

import numpy as np
import keras
from keras.models import Sequential
from keras.layers import Dense, Dropout, Flatten
from keras.layers import Conv2D, MaxPooling2D
from keras.optimizers import SGD

# Generate dummy data
x_train = np.random.random((100, 100, 100, 3))
y_train = keras.utils.to_categorical(np.random.randint(10, size=(100, 1)), num_classes=10)
x_test = np.random.random((20, 100, 100, 3))
y_test = keras.utils.to_categorical(np.random.randint(10, size=(20, 1)), num_classes=10)

model = Sequential()
# Input: 100x100 images with 3 channels -> (100, 100, 3) tensors.
# This applies 32 convolution filters of size 3x3 each.
model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(100, 100, 3)))
model.add(Conv2D(32, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))

model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(Conv2D(64, (3, 3), activation='relu'))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(Dropout(0.25))

model.add(Flatten())
model.add(Dense(256, activation='relu'))
model.add(Dropout(0.5))
model.add(Dense(10, activation='softmax'))

sgd = SGD(lr=0.01, decay=1e-6, momentum=0.9, nesterov=True)
model.compile(loss='categorical_crossentropy', optimizer=sgd)

model.fit(x_train, y_train, batch_size=32, epochs=10)
score = model.evaluate(x_test, y_test, batch_size=32)
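To see what the Flatten layer above hands to Dense(256), you can trace the spatial size through the stack (a quick sketch; it assumes Conv2D's default padding='valid' with stride 1, and non-overlapping 2x2 pooling):

```python
def valid_conv(n, k):
    # 'valid' convolution with stride 1 shrinks each spatial dim to n - k + 1
    return n - k + 1

size = 100                   # 100x100 input images
size = valid_conv(size, 3)   # Conv2D(32, (3, 3)) -> 98
size = valid_conv(size, 3)   # Conv2D(32, (3, 3)) -> 96
size //= 2                   # MaxPooling2D((2, 2)) -> 48
size = valid_conv(size, 3)   # Conv2D(64, (3, 3)) -> 46
size = valid_conv(size, 3)   # Conv2D(64, (3, 3)) -> 44
size //= 2                   # MaxPooling2D((2, 2)) -> 22
print(size * size * 64)      # Flatten() output length: 22 * 22 * 64 = 30976
```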

#10 Lisrelchen posted on 2017-4-26 10:36:46

Sequence classification with LSTM:

from keras.models import Sequential
from keras.layers import Dense, Dropout
from keras.layers import Embedding
from keras.layers import LSTM

max_features = 1024  # vocabulary size; not defined in the original snippet, value assumed

model = Sequential()
model.add(Embedding(max_features, output_dim=256))
model.add(LSTM(128))
model.add(Dropout(0.5))
model.add(Dense(1, activation='sigmoid'))

model.compile(loss='binary_crossentropy',
              optimizer='rmsprop',
              metrics=['accuracy'])

# x_train/y_train (and the test arrays) are assumed to be defined elsewhere
model.fit(x_train, y_train, batch_size=16, epochs=10)
score = model.evaluate(x_test, y_test, batch_size=16)
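The snippet above leaves its training arrays undefined. With assumed values (vocabulary size and sequence length are illustrative, not from the original post), the model expects integer word-index sequences shaped (num_samples, maxlen) and binary labels:

```python
import numpy as np

max_features = 1024  # assumed vocabulary size
maxlen = 80          # assumed (padded) sequence length

# Dummy integer sequences and binary labels in the shape the LSTM model expects
x_train = np.random.randint(max_features, size=(100, maxlen))
y_train = np.random.randint(2, size=(100, 1))
print(x_train.shape)  # (100, 80)
```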
