I need cohorts of active users: 1 day / 7 days / 14 days / 28 days, like Google Analytics. I can't seem to wrap my head around the query I need.
I know I need a self-join, but that's about as far as I can get.
I have this query, but it produces an incorrect result:
SELECT
DATE(logins.logged_in_at) AS `date`,
logins.company_id,
SUM(IF(d1.logged_in_at, 1, 0)) AS d1,
SUM(IF(d7.logged_in_at, 1, 0)) AS d7,
SUM(IF(d14.logged_in_at, 1, 0)) AS d14,
SUM(IF(d28.logged_in_at, 1, 0)) AS d28
FROM logins
LEFT JOIN logins AS d1
ON DATE(logins.logged_in_at) = (DATE(d1.logged_in_at) - INTERVAL 1 DAY)
LEFT JOIN logins AS d7
ON DATE(logins.logged_in_at) = (DATE(d7.logged_in_at) - INTERVAL 7 DAY)
LEFT JOIN logins AS d14
ON DATE(logins.logged_in_at) = (DATE(d14.logged_in_at) - INTERVAL 14 DAY)
LEFT JOIN logins AS d28
ON DATE(logins.logged_in_at) = (DATE(d28.logged_in_at) - INTERVAL 28 DAY)
GROUP BY
DATE(logins.logged_in_at),
logins.company_id
ORDER BY
logins.logged_in_at;
+------------+------------+----+----+-----+-----+
| date       | company_id | d1 | d7 | d14 | d28 |
+------------+------------+----+----+-----+-----+
| 2019-01-01 |          1 |  8 |  8 |   8 |   8 |
| 2019-01-02 |          1 |  2 |  2 |   2 |   0 |
| 2019-01-03 |          1 |  2 |  2 |   2 |   0 |
| 2019-01-04 |          1 |  2 |  2 |   2 |   0 |
| 2019-01-05 |          1 |  2 |  2 |   2 |   0 |
| 2019-01-06 |          1 |  2 |  2 |   2 |   0 |
| 2019-01-07 |          1 |  6 |  6 |   6 |   0 |
| 2019-01-08 |          1 |  6 |  6 |   6 |   0 |
| 2019-01-09 |          1 |  6 |  6 |   6 |   0 |
| 2019-01-10 |          1 |  6 |  6 |   6 |   0 |
| 2019-01-11 |          1 |  6 |  6 |   6 |   0 |
| 2019-01-12 |          1 |  6 |  6 |   6 |   0 |
| 2019-01-13 |          1 | 12 | 12 |  12 |   0 |
| 2019-01-14 |          1 | 48 | 48 |  48 |   0 |
| 2019-01-15 |          1 | 48 | 48 |  48 |   0 |
| 2019-01-16 |          1 | 12 | 12 |   0 |   0 |
| 2019-01-17 |          1 | 12 | 12 |   0 |   0 |
| 2019-01-18 |          1 | 12 | 12 |   0 |   0 |
| 2019-01-19 |          1 | 12 | 12 |   0 |   0 |
| 2019-01-20 |          1 | 18 | 18 |   0 |   0 |
| 2019-01-21 |          1 | 36 | 36 |   0 |   0 |
| 2019-01-22 |          1 | 36 | 36 |   0 |   0 |
| 2019-01-23 |          1 |  9 |  0 |   0 |   0 |
| 2019-01-24 |          1 |  9 |  0 |   0 |   0 |
| 2019-01-25 |          1 |  9 |  0 |   0 |   0 |
| 2019-01-26 |          1 |  9 |  0 |   0 |   0 |
| 2019-01-27 |          1 | 12 |  0 |   0 |   0 |
| 2019-01-28 |          1 | 16 |  0 |   0 |   0 |
| 2019-01-29 |          1 |  0 |  0 |   0 |   0 |
+------------+------------+----+----+-----+-----+
The table structure and sample data are as follows:
CREATE TABLE `logins` (
`id` int(10) unsigned NOT NULL AUTO_INCREMENT,
`company_id` int(11) NOT NULL,
`user_id` int(11) NOT NULL,
`logged_in_at` timestamp NOT NULL,
PRIMARY KEY (`id`),
KEY `logins_company_id_index` (`company_id`),
KEY `logins_user_id_index` (`user_id`),
KEY `logins_logged_in_at_index` (`logged_in_at`)
) ENGINE=InnoDB AUTO_INCREMENT=57 DEFAULT CHARSET=utf8 COLLATE=utf8_unicode_ci;
INSERT INTO `logins` (`id`, `company_id`, `user_id`, `logged_in_at`)
VALUES
(1, 1, 1, '2019-01-01 00:00:00'),
(2, 1, 1, '2019-01-02 00:00:00'),
(3, 1, 1, '2019-01-03 00:00:00'),
(4, 1, 1, '2019-01-04 00:00:00'),
(5, 1, 1, '2019-01-05 00:00:00'),
(6, 1, 1, '2019-01-06 00:00:00'),
(7, 1, 1, '2019-01-07 00:00:00'),
(8, 1, 1, '2019-01-08 00:00:00'),
(9, 1, 1, '2019-01-09 00:00:00'),
(10, 1, 1, '2019-01-10 00:00:00'),
(11, 1, 1, '2019-01-11 00:00:00'),
(12, 1, 1, '2019-01-12 00:00:00'),
(13, 1, 1, '2019-01-13 00:00:00'),
(14, 1, 1, '2019-01-14 00:00:00'),
(15, 1, 1, '2019-01-15 00:00:00'),
(16, 1, 1, '2019-01-16 00:00:00'),
(17, 1, 1, '2019-01-17 00:00:00'),
(18, 1, 1, '2019-01-18 00:00:00'),
(19, 1, 1, '2019-01-19 00:00:00'),
(20, 1, 1, '2019-01-20 00:00:00'),
(21, 1, 1, '2019-01-21 00:00:00'),
(22, 1, 1, '2019-01-22 00:00:00'),
(23, 1, 1, '2019-01-23 00:00:00'),
(24, 1, 1, '2019-01-24 00:00:00'),
(25, 1, 1, '2019-01-25 00:00:00'),
(26, 1, 1, '2019-01-26 00:00:00'),
(27, 1, 1, '2019-01-27 00:00:00'),
(28, 1, 1, '2019-01-28 00:00:00'),
(29, 1, 1, '2019-01-29 00:00:00'),
(30, 1, 2, '2019-01-14 00:00:00'),
(31, 1, 2, '2019-01-15 00:00:00'),
(32, 1, 2, '2019-01-16 00:00:00'),
(33, 1, 2, '2019-01-17 00:00:00'),
(34, 1, 2, '2019-01-18 00:00:00'),
(35, 1, 2, '2019-01-19 00:00:00'),
(36, 1, 2, '2019-01-20 00:00:00'),
(37, 1, 2, '2019-01-21 00:00:00'),
(38, 1, 2, '2019-01-22 00:00:00'),
(39, 1, 2, '2019-01-23 00:00:00'),
(40, 1, 2, '2019-01-24 00:00:00'),
(41, 1, 2, '2019-01-25 00:00:00'),
(42, 1, 2, '2019-01-26 00:00:00'),
(43, 1, 2, '2019-01-27 00:00:00'),
(44, 1, 2, '2019-01-28 00:00:00'),
(45, 1, 2, '2019-01-29 00:00:00'),
(46, 1, 3, '2019-01-21 00:00:00'),
(47, 1, 3, '2019-01-22 00:00:00'),
(48, 1, 3, '2019-01-23 00:00:00'),
(49, 1, 3, '2019-01-24 00:00:00'),
(50, 1, 3, '2019-01-25 00:00:00'),
(51, 1, 3, '2019-01-26 00:00:00'),
(52, 1, 3, '2019-01-27 00:00:00'),
(53, 1, 3, '2019-01-28 00:00:00'),
(54, 1, 3, '2019-01-29 00:00:00'),
(55, 1, 4, '2019-01-28 00:00:00'),
(56, 1, 4, '2019-01-29 00:00:00');
If I ran the query for 2019-01-29, I would expect to get something like this:
+------------+------------+----+----+-----+-----+
| company_id | date       | d1 | d7 | d14 | d28 |
+------------+------------+----+----+-----+-----+
|          1 | 2019-01-29 |  4 |  3 |   2 |   1 |
+------------+------------+----+----+-----+-----+
Please let me know if you need any more information.
Just because this query produces the same result, I'm guessing this is what you're after...
SELECT company_id
, date
, SUM(d1) d1
, SUM(d7) d7
, SUM(d14) d14
, SUM(d28) d28
FROM
( SELECT DISTINCT x.logged_in_at date
, x.company_id
, y.user_id
, y.logged_in_at = x.logged_in_at - INTERVAL 1 DAY d1
, y.logged_in_at = x.logged_in_at - INTERVAL 7 DAY d7
, y.logged_in_at = x.logged_in_at - INTERVAL 14 DAY d14
, y.logged_in_at = x.logged_in_at - INTERVAL 28 DAY d28
FROM logins x
JOIN logins y
ON y.company_id = x.company_id
AND y.logged_in_at < x.logged_in_at
WHERE x.logged_in_at = '2019-01-29'
) a
GROUP
BY company_id
, date;
The query below should do the trick. The inner query assigns each record in the table to a bucket, and the outer query does the aggregation.
SET @ref_date = '2019-01-29'; -- the reference date ("today"); set this before running the query

SELECT
company_id,
@ref_date,
COUNT(DISTINCT CASE WHEN bucket = 'd1' THEN user_id END) d1,
COUNT(DISTINCT CASE WHEN bucket = 'd7' THEN user_id END) d7,
COUNT(DISTINCT CASE WHEN bucket = 'd14' THEN user_id END) d14,
COUNT(DISTINCT CASE WHEN bucket = 'd28' THEN user_id END) d28
FROM (
SELECT l.*,
CASE
WHEN l.logged_in_at >= @ref_date - INTERVAL 1 DAY THEN 'd1'
WHEN l.logged_in_at >= @ref_date - INTERVAL 8 DAY THEN 'd7'
WHEN l.logged_in_at >= @ref_date - INTERVAL 15 DAY THEN 'd14'
WHEN l.logged_in_at >= @ref_date - INTERVAL 29 DAY THEN 'd28'
END bucket
FROM logins l
WHERE l.logged_in_at < @ref_date
) x
GROUP BY company_id;
With your sample data in this db fiddle demo, this returns:
| company_id | @ref_date  | d1  | d7  | d14 | d28 |
| ---------- | ---------- | --- | --- | --- | --- |
| 1          | 2019-01-29 | 4   | 3   | 2   | 1   |
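If you want to sanity-check the bucket logic outside of MySQL, here is a rough Python sketch: my own re-implementation of the `CASE` expression above (not the query itself), run against the sample data:

```python
from datetime import date, timedelta

# (user_id, login date) pairs from the sample data; company_id is 1 throughout
logins = (
    [(1, date(2019, 1, d)) for d in range(1, 30)]
    + [(2, date(2019, 1, d)) for d in range(14, 30)]
    + [(3, date(2019, 1, d)) for d in range(21, 30)]
    + [(4, date(2019, 1, d)) for d in range(28, 30)]
)

ref_date = date(2019, 1, 29)

def bucket(login_date):
    # Mirrors the CASE expression: the first matching window wins, and the
    # WHERE l.logged_in_at < @ref_date clause excludes the reference day itself.
    if login_date >= ref_date:
        return None
    if login_date >= ref_date - timedelta(days=1):
        return "d1"
    if login_date >= ref_date - timedelta(days=8):
        return "d7"
    if login_date >= ref_date - timedelta(days=15):
        return "d14"
    if login_date >= ref_date - timedelta(days=29):
        return "d28"
    return None

# COUNT(DISTINCT user_id) per bucket
counts = {
    b: len({user for user, d in logins if bucket(d) == b})
    for b in ("d1", "d7", "d14", "d28")
}
print(counts)  # {'d1': 4, 'd7': 3, 'd14': 2, 'd28': 1}
```

This reproduces the expected row: the buckets are exclusive tiers, so a user is counted in every bucket where they have at least one login, with each login assigned to its most recent window.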