Original poster: Lisrelchen

【nnet-ts】Neural network architecture for time series forecasting.

nnet-ts

Neural network architecture for time series forecasting.

Requirements and installation

This package relies heavily on numpy, scipy, pandas, theano and keras. Check their repositories for how to install them first.
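If you just want to try the package quickly, the dependencies are all available from PyPI as well; a minimal sketch (nnet-ts does not state version pins, so this assumes whatever pip resolves is acceptable):

pip install numpy scipy pandas theano keras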

Then, simply fetch the package from PyPI.

sudo pip install nnet-ts

Usage

Using the classical Box & Jenkins air passenger data.

from nnet_ts import *
time_series = np.array(pd.read_csv("AirPassengers.csv")["x"])
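If the wildcard import does not bring numpy and pandas into scope in your setup, importing them explicitly works just as well; a minimal sketch (it assumes AirPassengers.csv sits in the working directory with a column named "x", as in the snippet above):

import numpy as np
import pandas as pd
from nnet_ts import TimeSeriesNnet

time_series = np.array(pd.read_csv("AirPassengers.csv")["x"])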


Create a TimeSeriesNnet object and specify each layer size and activation function.

neural_net = TimeSeriesNnet(hidden_layers = [20, 15, 5], activation_functions = ['sigmoid', 'sigmoid', 'sigmoid'])
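Per the class source posted in the first reply below, the constructor also accepts an optimizer and a loss (defaults: SGD() and 'mean_absolute_error'); a sketch with those defaults written out explicitly:

from keras.optimizers import SGD

neural_net = TimeSeriesNnet(
    hidden_layers = [20, 15, 5],
    activation_functions = ['sigmoid', 'sigmoid', 'sigmoid'],
    optimizer = SGD(),
    loss = 'mean_absolute_error'
)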


Then just fit the data and predict values:

neural_net.fit(time_series, lag = 40, epochs = 10000)
neural_net.predict_ahead(n_ahead = 30)
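predict_ahead returns a numpy array with one value per forecast step (see self.predictions in the source below), so the result can be captured and inspected directly; a small sketch:

predictions = neural_net.predict_ahead(n_ahead = 30)
print(predictions.shape)   # (30,) - one forecast per step ahead
print(predictions[:5])     # first five forecasted passenger counts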

Did we get it right? Let's check

import matplotlib.pyplot as plt
plt.plot(range(len(neural_net.timeseries)), neural_net.timeseries, '-r', label='Predictions', linewidth=1)
plt.plot(range(len(time_series)), time_series, '-g', label='Original series')
plt.title("Box & Jenkins AirPassenger data")
plt.xlabel("Observation ordered index")
plt.ylabel("No. of passengers")
plt.legend()
plt.show()
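Because predict_ahead appends its forecasts to neural_net.timeseries (see the source in the first reply), the forecast horizon can also be plotted separately from the history; a minimal sketch:

import matplotlib.pyplot as plt

n_obs = len(time_series)
n_ahead = len(neural_net.predictions)

# The last n_ahead entries of neural_net.timeseries are the iterative forecasts,
# which are also stored in neural_net.predictions.
plt.plot(range(n_obs), time_series, '-g', label='Original series')
plt.plot(range(n_obs, n_obs + n_ahead), neural_net.predictions, '-r', label='Forecast')
plt.legend()
plt.show()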

Source repository:

https://github.com/hawk31/nnet-ts




Keywords: Architecture, Forecasting, Time Series, Architect, Forecast


Reply #1 (Lisrelchen, 2017-4-26 10:10:43):
import numpy as np
from keras.models import Sequential
from keras.layers.core import Dense, Activation
from keras.optimizers import SGD
from sklearn.preprocessing import StandardScaler
import logging

logging.basicConfig(format='%(levelname)s:%(message)s', level=logging.DEBUG)


class TimeSeriesNnet(object):
    def __init__(self, hidden_layers = [20, 15, 5], activation_functions = ['relu', 'relu', 'relu'],
                 optimizer = SGD(), loss = 'mean_absolute_error'):
        self.hidden_layers = hidden_layers
        self.activation_functions = activation_functions
        self.optimizer = optimizer
        self.loss = loss

        if len(self.hidden_layers) != len(self.activation_functions):
            raise Exception("hidden_layers size must match activation_functions size")

    def fit(self, timeseries, lag = 7, epochs = 10000, verbose = 0):
        self.timeseries = np.array(timeseries, dtype = "float64")  # Log transformation for variance stationarity is applied to the target below
        self.lag = lag
        self.n = len(timeseries)
        if self.lag >= self.n:
            raise ValueError("Lag is higher than length of the timeseries")
        self.X = np.zeros((self.n - self.lag, self.lag), dtype = "float64")
        self.y = np.log(self.timeseries[self.lag:])
        self.epochs = epochs
        self.scaler = StandardScaler()
        self.verbose = verbose

        logging.info("Building regressor matrix")
        # Building X matrix
        for i in range(0, self.n - lag):
            self.X[i, :] = self.timeseries[range(i, i + lag)]

        logging.info("Scaling data")
        self.scaler.fit(self.X)
        self.X = self.scaler.transform(self.X)

        logging.info("Checking network consistency")
        # Neural net architecture
        self.nn = Sequential()
        self.nn.add(Dense(self.hidden_layers[0], input_shape = (self.X.shape[1],)))
        self.nn.add(Activation(self.activation_functions[0]))

        for layer_size, activation_function in zip(self.hidden_layers[1:], self.activation_functions[1:]):
            self.nn.add(Dense(layer_size))
            self.nn.add(Activation(activation_function))

        # Add final node
        self.nn.add(Dense(1))
        self.nn.add(Activation('linear'))
        self.nn.compile(loss = self.loss, optimizer = self.optimizer)

        logging.info("Training neural net")
        # Train neural net
        self.nn.fit(self.X, self.y, nb_epoch = self.epochs, verbose = self.verbose)

    def predict_ahead(self, n_ahead = 1):
        # Store predictions and predict iteratively
        self.predictions = np.zeros(n_ahead)

        for i in range(n_ahead):
            self.current_x = self.timeseries[-self.lag:]
            self.current_x = self.current_x.reshape((1, self.lag))
            self.current_x = self.scaler.transform(self.current_x)
            self.next_pred = self.nn.predict(self.current_x)
            self.predictions[i] = np.exp(self.next_pred[0, 0])
            self.timeseries = np.concatenate((self.timeseries, np.exp(self.next_pred[0, :])), axis = 0)

        return self.predictions
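To make the sliding-window setup in fit concrete, here is a standalone sketch (toy numbers and lag = 3 chosen only for illustration) of how the regressor matrix X and the log-transformed target y are built before scaling and training:

import numpy as np

series = np.array([112., 118., 132., 129., 121., 135., 148.])  # toy values
lag = 3
n = len(series)

# Each row of X holds `lag` consecutive observations; the target is the log of
# the observation that follows them, mirroring the loop and np.log(...) in fit().
X = np.zeros((n - lag, lag))
for i in range(n - lag):
    X[i, :] = series[i:i + lag]
y = np.log(series[lag:])

print(X)
# [[112. 118. 132.]
#  [118. 132. 129.]
#  [132. 129. 121.]
#  [129. 121. 135.]]
print(np.round(y, 3))
# [4.86  4.796 4.905 4.997]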


Reply #2 (MouJack007, 2017-4-26 12:22:28):
Thanks to the original poster for sharing!

