R data.table: averaging values looked up via a join

Question · 0 votes · 3 answers

What I want to do is just a simple average (like the AVERAGE command in Excel). I'm using data.table for efficiency, since I have fairly large tables (~1m rows).

My goal is a lookup between these two tables:

Table 1 
| individual id | date        |
-------------------------------
| 1             |  2018-01-02 |
| 1             |  2018-01-03 |
| 2             |  2018-01-02 |
| 2             |  2018-01-03 |

Table 2 
| individual id | date2       | alpha |
---------------------------------------
| 1             |  2018-01-02 |  1    |  
| 1             |  2018-01-04 |  1.5  |
| 1             |  2018-01-05 |  1    |
| 2             |  2018-01-01 |  2    |  
| 2             |  2018-01-02 |  1    |
| 2             |  2018-01-05 |  4    |

Desired result

Updated table 1
| individual id | date        | mean(alpha) |
---------------------------------------------
| 1             |  2018-01-02 |  1          |
| 1             |  2018-01-03 |  1          |
| 2             |  2018-01-02 | 1.5         |
| 2             |  2018-01-03 | 1.5         |

This is simply the mean of all of the individual's values in Table 2 whose date2 falls on or before the date (inclusive). The result can be generated with the following MySQL command, but I haven't been able to reproduce it in R.

update table1
set daily_alpha_avg =
    (select avg(case when date2 < date then alpha else 0 end)
     from table2
     where table2.individual_id = table1.individual_id
     group by individual_id);

My best guess so far is:

table1[table2, on = .(individual_id, date>=date2), 
          .(x.individual_id, x.date, bb = mean(alpha)), by= .(x.date, x.individual_id)]

or

table1[, daily_alpha_avg := table2[table1, mean(alpha), on =.(individual_id, date>=date2)]]

But this doesn't work. I know it's wrong, I just don't know how to fix it.

Thanks for your help.

r data.table rmysql
3 Answers
4 votes

Using by = .EACHI you can do the following:

table2[table1, 
       on = .(`individual id`), 
       .(date = i.date, mean_alpha = mean(alpha[date2 <= i.date])),
       by = .EACHI]

#    individual id       date mean_alpha
# 1:             1 2018-01-02        1.0
# 2:             1 2018-01-03        1.0
# 3:             2 2018-01-02        1.5
# 4:             2 2018-01-03        1.5

Edit:

# Assign by reference as a new column
table1[, mean_alpha := table2[table1, 
                              on = .(`individual id`), 
                              mean(alpha[date2 <= i.date]),
                              by = .EACHI][["V1"]]]

Edit 2:

Here is a slightly more elegant approach suggested by Frank in the comments.

# In this solution our date columns can't be type character
table1[, date := as.Date(date)]
table2[, date2 := as.Date(date2)]

table1[, mean_alpha := table2[table1, # or equivalently .SD instead of table1
                              on = .(`individual id`, date2 <= date), 
                              mean(alpha), 
                              by = .EACHI][["V1"]]]

Reproducible data

table1 <- fread(
  "individual id | date       
   1             |  2018-01-02
   1             |  2018-01-03
   2             |  2018-01-02
   2             |  2018-01-03", 
  sep ="|"
)
table2 <- fread(
  "individual id | date2       | alpha
   1             |  2018-01-02 |  1     
   1             |  2018-01-04 |  1.5 
   1             |  2018-01-05 |  1   
   2             |  2018-01-01 |  2     
   2             |  2018-01-02 |  1   
   2             |  2018-01-05 |  4",
  sep = "|"
)

1 vote

Isn't the tidyverse's performance enough for you?

I could only replicate your table using date2 <= date, so I added the =.

#Please provide 

library(tidyverse)  # for tribble(), left_join(), mutate() and the pipe

table1 <- tribble(~individual_id,~date,
                  1,"2018-01-02",
                  1,"2018-01-03",
                  2,"2018-01-02",
                  2,"2018-01-03")

table2 <- tribble(~individual_id,~date2,~alpha,
                  1,"2018-01-02",1,
                  1,"2018-01-04",1.5,
                  1,"2018-01-05",1,
                  2,"2018-01-01",2,
                  2,"2018-01-02",1,
                  2,"2018-01-05",4)

df <- left_join(table1,table2) %>%
  mutate(date = as.Date(date),
         date2 = as.Date(date2))

df %>% 
  group_by(individual_id,date) %>% 
  mutate(case = ifelse(date2<=date,alpha,NA)) %>% 
  summarise(mean_alpha = mean(case,na.rm = TRUE))

Optionally, you can use the tidyverse to generate SQL queries; there are SQL translations. Check https://dbplyr.tidyverse.org/articles/sql-translation.html and use the show_query function to make sure you are using the same logic in SQL and in R.
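For example, a minimal sketch of checking the translation with show_query (assuming the dbplyr and RSQLite packages are installed; memdb_frame() and the toy data here are just for illustration):

library(dplyr)
library(dbplyr)

# memdb_frame() copies a small data frame into a temporary in-memory SQLite
# database and returns a lazy tbl, so show_query() prints the translated SQL
tbl2_db <- memdb_frame(individual_id = c(1, 1, 1, 2, 2, 2),
                       date2 = c("2018-01-02", "2018-01-04", "2018-01-05",
                                 "2018-01-01", "2018-01-02", "2018-01-05"),
                       alpha = c(1, 1.5, 1, 2, 1, 4))

tbl2_db %>%
  group_by(individual_id) %>%
  summarise(mean_alpha = mean(alpha, na.rm = TRUE)) %>%
  show_query()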


0 votes

Just use the sqldf package and put your query inside sqldf().

library(sqldf)
sqldf("your SQL goes here")
table1

That's it.
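For instance, here is a sketch of a SELECT version of the query (an UPDATE inside sqldf() would only modify the temporary SQLite copy of table1, not the R object). It assumes table1/table2 data frames with individual_id, date/date2 and alpha columns, like the tribble versions in the previous answer:

library(sqldf)

# Omitting ELSE makes rows with date2 > date contribute NULL, which AVG() skips,
# so each row of table1 gets the mean of alphas on or before its date
sqldf("
  SELECT t1.individual_id,
         t1.date,
         AVG(CASE WHEN t2.date2 <= t1.date THEN t2.alpha END) AS mean_alpha
  FROM table1 t1
  JOIN table2 t2
    ON t1.individual_id = t2.individual_id
  GROUP BY t1.individual_id, t1.date
")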
