I am analyzing an application's output log with pandas and want to assign each entry to a session. A session is defined as 60 minutes from its start.
Here is a small example:
import numpy as np
import pandas as pd
from datetime import timedelta
df = pd.DataFrame({
    'time': [
        pd.Timestamp(2019, 1, 1, 1, 10),
        pd.Timestamp(2019, 1, 1, 1, 15),
        pd.Timestamp(2019, 1, 1, 1, 20),
        pd.Timestamp(2019, 1, 1, 2, 20),
        pd.Timestamp(2019, 1, 1, 5, 0),
        pd.Timestamp(2019, 1, 1, 5, 15)
    ]
})
df

                 time
0 2019-01-01 01:10:00
1 2019-01-01 01:15:00
2 2019-01-01 01:20:00
3 2019-01-01 02:20:00
4 2019-01-01 05:00:00
5 2019-01-01 05:15:00
For the first row, start_time equals time. For each subsequent row, if its time is within 1 hour of the previous row's time, it is considered part of the same session. Otherwise, it starts a new session with start_time = time. I am currently using a loop:
df['start_time'] = pd.NaT
for index in df.index:
    if index == 0:
        start_time = df['time'][index]
    else:
        delta = df['time'][index] - df['time'][index - 1]
        # same session if within an hour of the previous row, otherwise start a new one
        start_time = df['start_time'][index - 1] if delta.total_seconds() <= 3600 else df['time'][index]
    df.loc[index, 'start_time'] = start_time
Output:
time start_time
0 2019-01-01 01:10:00 2019-01-01 01:10:00
1 2019-01-01 01:15:00 2019-01-01 01:10:00
2 2019-01-01 01:20:00 2019-01-01 01:10:00
3 2019-01-01 02:20:00 2019-01-01 01:10:00
4 2019-01-01 05:00:00 2019-01-01 05:00:00 # new session
5 2019-01-01 05:15:00 2019-01-01 05:00:00
It works, but it is slow. Is there a way to vectorize it?
Use diff and cumsum to create the group key, then take the first value of each group with that key:
s = (df.time.diff() / np.timedelta64(1, 's')).gt(3600).cumsum()
df.groupby(s)['time'].transform('first')
Out[833]:
0 2019-01-01 01:10:00
1 2019-01-01 01:10:00
2 2019-01-01 01:10:00
3 2019-01-01 01:10:00
4 2019-01-01 05:00:00
5 2019-01-01 05:00:00
Name: time, dtype: datetime64[ns]
df['start_time'] = df.groupby(s)['time'].transform('first')
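To see what the grouping key does, it helps to inspect the intermediate series `s` on its own (a small sketch reusing the example frame from the question):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({
    'time': [
        pd.Timestamp(2019, 1, 1, 1, 10),
        pd.Timestamp(2019, 1, 1, 1, 15),
        pd.Timestamp(2019, 1, 1, 1, 20),
        pd.Timestamp(2019, 1, 1, 2, 20),
        pd.Timestamp(2019, 1, 1, 5, 0),
        pd.Timestamp(2019, 1, 1, 5, 15),
    ]
})

# diff() gives the gap to the previous row (NaT for the first row);
# dividing by one second converts it to float seconds.
# A gap strictly greater than 3600 s flags a new session, and cumsum
# turns those flags into an increasing group key.
s = (df.time.diff() / np.timedelta64(1, 's')).gt(3600).cumsum()
print(s.tolist())  # → [0, 0, 0, 0, 1, 1]
```

Note that row 3 (02:20) stays in group 0: its gap to the previous row is exactly 3600 s, and `gt` is a strict comparison, matching the loop's `<= 3600` condition.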
I use np.where, shift and cumsum to create a session ID, then use transform with min to get the start time:
df['session_id'] = np.where((df['time'] - df['time'].shift(1)).astype('timedelta64[m]').fillna(0) > 60, 1, 0).cumsum()
df['start_time'] = df.groupby(['session_id'])['time'].transform('min')
display(df)
time session_id start_time
0 2019-01-01 01:10:00 0 2019-01-01 01:10:00
1 2019-01-01 01:15:00 0 2019-01-01 01:10:00
2 2019-01-01 01:20:00 0 2019-01-01 01:10:00
3 2019-01-01 02:20:00 0 2019-01-01 01:10:00
4 2019-01-01 05:00:00 1 2019-01-01 05:00:00
5 2019-01-01 05:15:00 1 2019-01-01 05:00:00
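On recent pandas versions, `.astype('timedelta64[m]')` on a timedelta Series may no longer be supported; an equivalent gap test can be written with `dt.total_seconds()` instead (a sketch of the same session-ID idea, not the original answer's code):

```python
import numpy as np
import pandas as pd

df = pd.DataFrame({
    'time': [
        pd.Timestamp(2019, 1, 1, 1, 10),
        pd.Timestamp(2019, 1, 1, 1, 15),
        pd.Timestamp(2019, 1, 1, 1, 20),
        pd.Timestamp(2019, 1, 1, 2, 20),
        pd.Timestamp(2019, 1, 1, 5, 0),
        pd.Timestamp(2019, 1, 1, 5, 15),
    ]
})

# Gap to the previous row in minutes; the first row has no previous row,
# so its NaN gap is treated as 0.
gap_min = df['time'].diff().dt.total_seconds().div(60).fillna(0)

# A gap of more than 60 minutes opens a new session.
df['session_id'] = np.where(gap_min > 60, 1, 0).cumsum()
df['start_time'] = df.groupby('session_id')['time'].transform('min')
```

This produces the same session IDs and start times as the output shown above.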