3. AMeDAS Weather Data Analysis Challenge! (Python Edition): Day 2 Assignments

Copyright 2021 Weather Business Consortium (WXBC), Gifu University, Jun Yoshino
(C) 2021 WXBC, Gifu University, Jun Yoshino

<Terms of Use>
This document may be reproduced, adapted, translated, transcribed, quoted, publicly transmitted, or otherwise used, in whole or in part, free of charge, on the condition that the content concerning the requirements, technologies, and methods described herein is not altered and that the source is clearly indicated. If the whole document is reproduced, adapted, or translated, the copyright notice and these terms of use must be clearly indicated.

<Disclaimer>
The copyright holders of this document make no warranty regarding its contents, including their accuracy, merchantability, or fitness for a particular purpose, nor do they warrant that the contents do not infringe patents, copyrights, or other rights. The copyright holders accept no legal liability whatsoever for damages arising from the use of this document.

3.1 Overview of the Python Program

A Python program that brings the whole workflow together is given in the cell below. It is a verbatim copy of predictive_analytics.py in the same folder. Its organization differs slightly from the earlier explanations in a few places (try to find them), but it is essentially built from the material already explained.

In [1]:
# -*- coding: utf-8 -*-
"""
The MIT License

Copyright 2021 WXBC, Gifu University, Jun Yoshino
(C) 2021 WXBC, Gifu University, Jun Yoshino

Permission is hereby granted, free of charge, to any person obtaining a copy 
of this software and associated documentation files (the "Software"), to deal 
in the Software without restriction, including without limitation the rights 
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell 
copies of the Software, and to permit persons to whom the Software is 
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all 
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR 
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY, 
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE 
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER 
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM, 
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE 
SOFTWARE.

"""
# Import numpy under the alias np.
import numpy as np
# Import matplotlib.pyplot under the alias plt.
import matplotlib.pyplot as plt
# Import Seaborn.
import seaborn as sns
# datetime is handy for working with date and time data; import it.
from datetime import datetime
# Import pandas under the alias pd.
import pandas as pd
# Import the following when drawing time-series plots with matplotlib,
from pandas.plotting import register_matplotlib_converters
# and register it here.
register_matplotlib_converters()
# Import the preprocessing library from sklearn (scikit-learn).
from sklearn import preprocessing
# Import the multiple linear regression model from sklearn (scikit-learn).
from sklearn import linear_model
# Import the neural-network library from sklearn (scikit-learn).
from sklearn import neural_network
# Import the support vector machine library from sklearn (scikit-learn).
from sklearn.svm import SVR
# pickle is used to save and restore trained models.
import pickle
#%precision 3
#%matplotlib inline

#def 1.############################################
# Function that reads the JMA AMeDAS temperature time-series data
# and assigns it to a DataFrame.
def readamedas(filename,skipline): 
    # Read the csv file with pandas' read_csv method, passing:
    # [0] the input file name,
    # [1] the encoding,
    # [2] the number of lines to skip,
    # [3] the column names,
    # [4] the column names to parse as datetime, and
    # [5] the column name to use as the index.
    df=pd.read_csv(
       filename,
       encoding='Shift_JIS',
       skiprows=skipline,
       names=['date_time','T','dummy1','dummy2','Td','dummy3','dummy4'],
       parse_dates={'datetime':['date_time']}, 
       index_col='datetime'
       )
    return df
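# For reference, a data row of amedas.csv (after its header lines) is assumed
# to look roughly like the following hypothetical example, where the dummy
# columns presumably hold quality/remark flags that the program discards:
#   2017/1/1 1:00,2.8,8,1,-6.4,8,1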

#def 2.############################################
# Function that reads the TEPCO power-consumption time-series data
# and assigns it to a DataFrame.
def readtepco(filename,skipline): 
    # Read the csv file with pandas' read_csv method, passing:
    # [0] the input file name,
    # [1] the encoding,
    # [2] the number of lines to skip,
    # [3] the column names,
    # [4] the column names to parse as datetime, and
    # [5] the column name to use as the index.
    df=pd.read_csv(
       filename,
       encoding='Shift_JIS',
       skiprows=skipline,
       names=['date','time','POWER'],
       parse_dates={'datetime':['date','time']},
       index_col='datetime'
       )
    return df
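# For reference, a data row of tepco.csv (after its header lines) is assumed
# to look roughly like this hypothetical example (date, time, demand):
#   2017/1/1,0:00,2962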

#def 3.############################################
# Function that merges the two DataFrames, removes missing values, and adds
# explanatory-variable columns for the prediction model. Here the dew-point
# depression, hour, day of month, day of week, month, and past power
# consumption are extracted or computed and appended to DataFrame df.
def dataprocess(x_data, y_data):
    # Merge the two DataFrames (x_data and y_data) into DataFrame df.
    df=pd.merge(y_data, x_data, how="outer", left_index=True, right_index=True)
    # Drop rows that contain NaN.
    df=df.dropna(how='any')
    # Compute the dew-point depression as a new column T-Td, then drop Td, which is no longer used.
    df['T-Td']=df['T']-df['Td']
    df=df.drop(['Td'],axis=1)
    # Extract the hour from the index into DataFrame hour (key: 'HOUR').
    # Split the hours into 3-hour bins keyed HOUR1, HOUR2, ..., HOUR8:
    # 1 if the row falls in the bin, 0 otherwise. Used later as explanatory
    # variables in training. The HOUR column itself is then dropped.
    hour=pd.DataFrame({'HOUR': df.index.hour}, index=df.index)
    hour['HOUR1']=( ( hour['HOUR'] >= 0 ) & ( hour['HOUR'] < 3 ) ).astype(int)
    hour['HOUR2']=( ( hour['HOUR'] >= 3 ) & ( hour['HOUR'] < 6 ) ).astype(int)
    hour['HOUR3']=( ( hour['HOUR'] >= 6 ) & ( hour['HOUR'] < 9 ) ).astype(int)
    hour['HOUR4']=( ( hour['HOUR'] >= 9 ) & ( hour['HOUR'] < 12 ) ).astype(int)
    hour['HOUR5']=( ( hour['HOUR'] >= 12 ) & ( hour['HOUR'] < 15 ) ).astype(int)
    hour['HOUR6']=( ( hour['HOUR'] >= 15 ) & ( hour['HOUR'] < 18 ) ).astype(int)
    hour['HOUR7']=( ( hour['HOUR'] >= 18 ) & ( hour['HOUR'] < 21 ) ).astype(int)
    hour['HOUR8']=( ( hour['HOUR'] >= 21 ) & ( hour['HOUR'] < 24 ) ).astype(int)
    hour=hour.drop(['HOUR'],axis=1)
    # Append the hour columns to DataFrame df.
    df=pd.merge(df, hour, how="outer", left_index=True, right_index=True)
    # Extract the day of the month from the index into DataFrame day (key: 'DAY').
    # Split the days into 10-day bins keyed DAY1, DAY2, DAY3: 1 if the row
    # falls in the bin, 0 otherwise. Used later as explanatory variables.
    # The DAY column itself is then dropped.
    day=pd.DataFrame({'DAY': df.index.day}, index=df.index)
    day['DAY1']=( ( day['DAY'] >= 1 ) & ( day['DAY'] < 11 ) ).astype(int)
    day['DAY2']=( ( day['DAY'] >= 11 ) & ( day['DAY'] < 21 ) ).astype(int)
    day['DAY3']=( ( day['DAY'] >= 21 ) & ( day['DAY'] < 32 ) ).astype(int)
    day=day.drop(['DAY'],axis=1)
    # Append the day columns to DataFrame df.
    df=pd.merge(df, day, how="outer", left_index=True, right_index=True)
    # Extract the weekday from the index into DataFrame week (key: 'WEEK'):
    # 1 for weekends (5, 6), 0 for weekdays (0-4).
    week=pd.DataFrame({'WEEK': df.index.weekday}, index=df.index)
    week['WEEK']=( week['WEEK'] >= 5 ).astype(int)
    # Append the week column to DataFrame df.
    df=pd.merge(df, week, how="outer", left_index=True, right_index=True)
    # Extract the month from the index into DataFrame month (key: 'MONTH').
    # Split the months into 3-month seasons keyed DJF, MAM, JJA, SON: 1 if
    # the row falls in the season, 0 otherwise. Used later as explanatory
    # variables. The MONTH column itself is then dropped.
    month=pd.DataFrame({'MONTH': df.index.month}, index=df.index)
    month['DJF']=( ( month['MONTH'] == 12 ) | ( month['MONTH'] ==  1 ) | ( month['MONTH'] ==  2 ) ).astype(int)
    month['MAM']=( ( month['MONTH'] ==  3 ) | ( month['MONTH'] ==  4 ) | ( month['MONTH'] ==  5 ) ).astype(int)
    month['JJA']=( ( month['MONTH'] ==  6 ) | ( month['MONTH'] ==  7 ) | ( month['MONTH'] ==  8 ) ).astype(int)
    month['SON']=( ( month['MONTH'] ==  9 ) | ( month['MONTH'] == 10 ) | ( month['MONTH'] == 11 ) ).astype(int)
    month=month.drop(['MONTH'],axis=1)
    # Append the month columns to DataFrame df.
    df=pd.merge(df, month, how="outer", left_index=True, right_index=True)
    # Shift the 'POWER' column forward by 24 hours into DataFrame power24
    # (key: 'POWER24'), so each row holds the consumption observed 24 hours
    # earlier. Used later as an explanatory variable.
    power24=pd.DataFrame({'POWER24': df['POWER'].shift(+24)}, index=df.index)
    # Append the power24 column to DataFrame df.
    df=pd.merge(df, power24, how="outer", left_index=True, right_index=True)
    # Shift 'POWER' forward by 1 hour into DataFrame power1 (key: 'POWER1').
    # Used later as an explanatory variable.
    power1=pd.DataFrame({'POWER1': df['POWER'].shift(+1)}, index=df.index)
    # Append the power1 column to DataFrame df.
    df=pd.merge(df, power1, how="outer", left_index=True, right_index=True)
    # Shift 'POWER' forward by 2 hours into DataFrame power2 (key: 'POWER2').
    # Used later as an explanatory variable.
    power2=pd.DataFrame({'POWER2': df['POWER'].shift(+2)}, index=df.index)
    # Append the power2 column to DataFrame df.
    df=pd.merge(df, power2, how="outer", left_index=True, right_index=True)
    # Shift 'POWER' forward by 3 hours into DataFrame power3 (key: 'POWER3').
    # Used later as an explanatory variable.
    power3=pd.DataFrame({'POWER3': df['POWER'].shift(+3)}, index=df.index)
    # Append the power3 column to DataFrame df.
    df=pd.merge(df, power3, how="outer", left_index=True, right_index=True)
    # Drop rows that contain NaN.
    df=df.dropna(how='any')
    return df
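# Note on shift(): s.shift(+n) moves each value n rows later in time, so a
# shifted column at hour t holds the value observed n hours earlier; e.g.
# pd.Series([10, 20, 30]).shift(1) gives [NaN, 10.0, 20.0].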

#def 4.############################################
# Split the data into training (train) and verification (test) sets.
def preprocess(df,x_cols,y_cols,split):
    # From DataFrame df, take the values (.values) of the x_cols columns
    # as float (floating-point numbers).
    x=df.loc[:,x_cols].values.astype('float')
    # From DataFrame df, take the values (.values) of the y_cols columns
    # as float (floating-point numbers),
    # flattened to one dimension with the ravel() method.
    y=df.loc[:,y_cols].values.astype('float').ravel()
    # Rows 0 through split-1 of x become the training data.
    x_train= x[:split]
    # Rows 0 through split-1 of y become the training data.
    y_train= y[:split]
    # Rows from split onward of x become the verification data.
    x_test= x[split:]
    # Rows from split onward of y become the verification data.
    y_test= y[split:]
    return x_train,y_train,x_test,y_test

#def 5.############################################
# Creates a time-series plot from the prediction-model results (test and train).
def timeseries(datetime_x, y1, y2, filename): 
    # Use the datetime index passed in as X.
    X=datetime_x
    # Assign the observed values to Y1.
    Y1=y1
    # Assign the predicted values to Y2.
    Y2=y2
    # Set the size of the time-series figure.
    plt.figure(figsize=(20, 10))
    # Time series of the observations.
    plt.plot(X,Y1,color='blue',label='observed')
    # Time series of the predictions.
    plt.plot(X,Y2,color='red',label='predicted')
    # Title of the plot.
    plt.title("Timeseries")
    # Label of the x axis.
    plt.xlabel('Time')
    # Label of the y axis (first axis, on the left).
    plt.ylabel('Electric Power [$10^6$kW]')
    # Legend (placed at the upper left).
    plt.legend(loc='upper left')
    # File name to save to.
    plt.savefig(filename)
    # Close the figure.
    plt.close()
    return

#def 6.############################################
# Creates a scatter plot from the prediction-model results (test and train).
def scatter(x, y,score, filename): 
    # Build the string "R2=score".
    score="R2="+str(score)
    # Set the size of the scatter plot.
    plt.figure(figsize=(8, 8))
    # Plot the scatter points.
    plt.plot(x, y, 'o')
    # Draw the line Y=X.
    plt.plot(x, x)
    # Place the "R2=..." string at the upper left of the figure.
    plt.text(np.nanmin(x), np.nanmax(x), score)
    # Title of the plot.
    plt.title("Scatter diagram")
    # Label of the x axis.
    plt.xlabel('Observed Electric Power [$10^6$kW]')
    # Label of the y axis.
    plt.ylabel('Predicted Electric Power [$10^6$kW]')    
    # File name to save to.
    plt.savefig(filename)
    # Close the figure.
    plt.close()
    return

###################################################
# The main program starts here.
###################################################

#main 1.###########################################
# Specify the name of the csv file containing JMA AMeDAS (Tokyo) temperature (℃)
# and dew-point temperature (℃).
filename1='amedas.csv'
# The first 5 lines of the csv file are not data, so skip them.
skipline1=5
# Call the function that reads the JMA AMeDAS (Tokyo) temperature and dew-point
# csv file and assigns it to a pandas DataFrame (amedas).
amedas=readamedas(filename1,skipline1)
# Drop the columns dummy1 through dummy4 from DataFrame amedas.
amedas=amedas.drop(['dummy1','dummy2','dummy3','dummy4'],axis=1)
#amedas

#main 2.##########################################
# Specify the name of the csv file containing TEPCO power consumption (10^6 kW).
filename2='tepco.csv'
# The first 3 lines of the csv file are not data, so skip them.
skipline2=3
# Call the function that reads the TEPCO power-consumption (10^6 kW) csv file
# and assigns it to a pandas DataFrame (tepco).
tepco=readtepco(filename2,skipline2)
# tepco

#main 3.##########################################
# Merge the two DataFrames (amedas and tepco), remove missing values, and add
# columns of other information likely to serve as explanatory variables besides temperature.
data=dataprocess(amedas,tepco)

#main 4.##########################################
# Choose the columns of DataFrame data to use as explanatory variables for
# the prediction model. Try out various combinations.
x_cols=[]
# Temperature
x_cols.extend(['T'])
# Dew-point depression
#x_cols.extend(['T-Td'])
# Day of week
x_cols.extend(['WEEK'])
# Hour of day
x_cols.extend(['HOUR1','HOUR2','HOUR3','HOUR4','HOUR5','HOUR6','HOUR7','HOUR8'])
# Day of month
#x_cols.extend(['DAY1','DAY2','DAY3'])
# Month (season)
#x_cols.extend(['DJF','MAM','JJA','SON'])
# Power consumption 24 hours earlier
#x_cols.extend(['POWER24'])
# Power consumption 1 hour earlier
#x_cols.extend(['POWER1'])
# Power consumption 2 hours earlier
#x_cols.extend(['POWER2'])
# Power consumption 3 hours earlier
#x_cols.extend(['POWER3'])
# Choose the column of DataFrame data to use as the target variable of the
# prediction model. Here the power consumption data ('POWER') is the target.
y_cols=['POWER']
# To split into training and verification data, divide the total number of rows (len(data)) by 2.
dt1=int(len(data)/2)
# Split DataFrame data into training data (x_train,y_train)
# and verification data (x_test,y_test).
# The first half before dt1 (mainly 2017) becomes the training data.
# The second half after dt1 (mainly 2018) becomes the verification data.
# The columns listed in x_cols are treated as explanatory variables.
# The columns listed in y_cols are treated as the target variable.
x_train, y_train, x_test, y_test=preprocess(data, x_cols, y_cols, dt1)
#print('x_train.shape=',x_train.shape)
#print('y_train.shape=',y_train.shape)
#print('x_test.shape=',x_test.shape)
#print('y_test.shape=',y_test.shape)

#main 5.##########################################
# Standardize the explanatory variables to zero mean and unit standard deviation.
# Create an instance called scaler from preprocessing.StandardScaler().
scaler=preprocessing.StandardScaler()
# The fit method computes and memorizes the mean and variance of the explanatory variables x_train.
scaler.fit(x_train)
# Standardize the explanatory variables x_train and return the transformed array.
x_train=scaler.transform(x_train)
# Standardize the explanatory variables x_test and return the transformed array.
x_test=scaler.transform(x_test)
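# For reference, StandardScaler transforms each feature column j as
#   z = (x - mean_j) / std_j
# with mean_j and std_j memorized from x_train by fit(); the inverse mapping
# x = z * std_j + mean_j is applied by scaler.inverse_transform().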

#main 6.##########################################
# Create the model instance.
# For the neural-network model:
#  solver="adam"  optimization method (lbfgs, sgd, adam)
#  activation="relu"  activation function (identity, logistic, tanh, relu)
#  max_iter=10000  maximum number of iterations
#  tol=1e-5  convergence tolerance for training
#  hidden_layer_sizes=(10,10,)  number of nodes per hidden layer (multiple layers allowed)
#  alpha=0.0001  L2 regularization penalty coefficient
#  batch_size='auto'  batch-size setting
#  random_state=1  random seed used for the initial weights and biases and for batch sampling
model = neural_network.MLPRegressor(solver="adam", activation="relu", hidden_layer_sizes=(10,10,), max_iter=10000, tol=1e-5, random_state=1, verbose=True)
# For the multiple linear regression model:
#model = linear_model.LinearRegression()
# For a linear support vector machine (linear kernel):
#model = SVR(kernel='linear', C=1e3)
# For a nonlinear support vector machine (Gaussian kernel):
#  C=1e3  balance between the SVM objective and the regularization term
#  gamma=0.1  spread of the Gaussian centered on the training data (inverse of the variance)
#model = SVR(kernel='rbf', C=1e3, gamma=0.1)
# Optimize the prediction model with the fit method, using the target
# variable y_train and the explanatory variables x_train.
model.fit(x_train, y_train)
# Retrieve the parameters of the prediction model.
#print(model.get_params(deep=True))
# Save the trained prediction model.
#with open('model.pickle', mode='wb') as fp:
#    pickle.dump(model, fp)
# Restore the trained prediction model.
#with open('model.pickle', mode='rb') as fp:
#    model=pickle.load(fp)
# Coefficient of determination R2 (training data)
score_train=model.score(x_train, y_train)
print('train R2 score=', score_train)
# Coefficient of determination R2 (verification data)
score_test=model.score(x_test, y_test)
print('test R2 score=', score_test)
# Generate predictions (training data).
y_train_predict=model.predict(x_train)
# Generate predictions (verification data).
y_test_predict=model.predict(x_test)
# Undo the standardization (training data). Note: the original wrapped the
# argument in a list, which makes the input 3-D and breaks inverse_transform.
x_train=scaler.inverse_transform(x_train)
# Undo the standardization (verification data).
x_test=scaler.inverse_transform(x_test)

#main 7.##########################################
# Visualize the prediction-model results.
# Take the index (time information) of DataFrame data used by the
# prediction model as datetime_x.
datetime_x=list(data.index)
# Get the row number of the 2018-01-01 data:
# as before, divide the total number of rows (len(data)) by 2.
dt1=int(len(data)/2)
# Number of rows in one month of training data (31 days x 24 hours).
dtm1=744
# Get the row number of the 2018-02-01 data.
dtm2=dt1+dtm1
# Create time-series plots from the data.
# One year of training data. File name: timeseries_train.png
timeseries(datetime_x[:dt1],y_train,y_train_predict, "timeseries_train.png")
# One year of verification data. File name: timeseries_test.png
timeseries(datetime_x[dt1:],y_test,y_test_predict, "timeseries_test.png")
# One month (January) of training data. File name: timeseries_train_jan.png
timeseries(datetime_x[:dtm1], y_train[:dtm1], y_train_predict[:dtm1], "timeseries_train_jan.png")
# One month (January) of verification data. File name: timeseries_test_jan.png
timeseries(datetime_x[dt1:dtm2],y_test[:dtm1], y_test_predict[:dtm1], "timeseries_test_jan.png")
# Create scatter plots from the data.
# One year of training data. File name: scatter_train.png
scatter(y_train, y_train_predict, score_train, "scatter_train.png")
# One year of verification data. File name: scatter_test.png
scatter(y_test, y_test_predict, score_test, "scatter_test.png")
Iteration 1, loss = 5526009.39155152
Iteration 2, loss = 5521030.96747619
Iteration 3, loss = 5513918.77747958
... (iterations 4 through 855 of the training log omitted for brevity) ...
Iteration 856, loss = 24054.28714868
Iteration 857, loss = 24063.43881214
Training loss did not improve more than tol=0.000010 for 10 consecutive epochs. Stopping.
train R2 score= 0.8757665807741901
test R2 score= 0.8665886639074808
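
As a side note, the pickle save and restore lines in the program above are left commented out. Below is a minimal sketch of how they could be used right after running the cell above, so that model, scaler, x_test, and y_test are still in scope. Note that main 6 un-standardizes x_test at the end, so the features must be re-standardized before scoring.

import pickle

# Save the trained prediction model to disk.
with open('model.pickle', mode='wb') as fp:
    pickle.dump(model, fp)
# Read it back and confirm it reproduces the verification R2 score.
with open('model.pickle', mode='rb') as fp:
    restored=pickle.load(fp)
print('restored test R2 score=', restored.score(scaler.transform(x_test), y_test))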

3.2 Try Analyzing on Your Own

Assignment 1. Tune the neural network's hyperparameters and explanatory variables to build a more accurate statistical prediction model of power consumption (a hedged starting sketch follows in the cell below).

In [ ]:
# Using the program above as a base, enter your answer program here.
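# A minimal sketch (one possible starting point, not the official answer):
# rebuild the standardized train/test arrays, then loop over a few candidate
# hidden-layer sizes and compare the verification R2 scores. Assumes the
# cell in section 3.1 has been run, so data, x_cols, y_cols are in scope.
x_train, y_train, x_test, y_test=preprocess(data, x_cols, y_cols, int(len(data)/2))
scaler=preprocessing.StandardScaler().fit(x_train)
x_train=scaler.transform(x_train)
x_test=scaler.transform(x_test)
for layers in [(10,), (10, 10), (30, 30), (50, 50, 50)]:
    m=neural_network.MLPRegressor(solver="adam", activation="relu",
                                  hidden_layer_sizes=layers,
                                  max_iter=10000, tol=1e-5, random_state=1)
    m.fit(x_train, y_train)
    print(layers, 'train R2=', m.score(x_train, y_train),
          'test R2=', m.score(x_test, y_test))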

Assignment 2. Compare statistical prediction models built with a neural network (the neural_network class) and with multiple linear regression (the linear_model class), as sketched below.

In [ ]:
# Using the program above as a base, enter your answer program here.
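# A minimal sketch: fit the neural network and the multiple linear regression
# on the same standardized features and compare their verification R2 scores.
# Assumes data, x_cols, y_cols from the section 3.1 cell are in scope.
x_train, y_train, x_test, y_test=preprocess(data, x_cols, y_cols, int(len(data)/2))
scaler=preprocessing.StandardScaler().fit(x_train)
x_train, x_test=scaler.transform(x_train), scaler.transform(x_test)
models={
    'neural network': neural_network.MLPRegressor(
        solver="adam", activation="relu", hidden_layer_sizes=(10,10,),
        max_iter=10000, tol=1e-5, random_state=1),
    'linear regression': linear_model.LinearRegression(),
}
for name, m in models.items():
    m.fit(x_train, y_train)
    print(name, 'test R2 score=', m.score(x_test, y_test))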

Assignment 3. Obtain data covering 2019 as the training period and 2020 as the verification period, and examine whether the power consumption during the COVID-19 pandemic (2020) could have been predicted from a model trained on 2019 (see the sketch in the cell below).

In [ ]:
# Using the program above as a base, enter your answer program here.
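# A minimal sketch, assuming 2019-2020 versions of the two csv files have
# been downloaded in the same format (the file names below are hypothetical):
amedas2=readamedas('amedas_2019_2020.csv',5)
amedas2=amedas2.drop(['dummy1','dummy2','dummy3','dummy4'],axis=1)
tepco2=readtepco('tepco_2019_2020.csv',3)
data2=dataprocess(amedas2,tepco2)
# Split on the calendar year: 2019 rows train the model, 2020 rows (the
# COVID-19 period) verify it.
split=int((data2.index.year==2019).sum())
x_train, y_train, x_test, y_test=preprocess(data2, x_cols, y_cols, split)
# From here, standardize and fit exactly as in main 5 and main 6, then
# compare the 2020 verification score with the 2018 result above.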