Working with Per-Group and Overall Summary Values Simultaneously

A grouped query cannot also produce the overall total in the same pass, so one approach uses two statements: first save the grand total in a user variable, then reference that variable in the grouped query to compute each group's percentage of the total.
mysql>
mysql> CREATE TABLE mytable
    -> (
    ->  rec_id          INT UNSIGNED NOT NULL AUTO_INCREMENT,
    ->  name            VARCHAR(20) NOT NULL,
    ->  trav_date       DATE NOT NULL,
    ->  miles           INT NOT NULL,
    ->  PRIMARY KEY (rec_id)
    -> );
Query OK, 0 rows affected (0.00 sec)

mysql>
mysql> INSERT INTO mytable (name,trav_date,miles)
    ->  VALUES
    ->          ('Ben','2010-11-30',152),
    ->          ('Suzi','2010-11-29',391),
    ->          ('Henry','2010-11-29',300),
    ->          ('Henry','2010-11-27',96),
    ->          ('Ben','2010-11-29',131),
    ->          ('Henry','2010-11-26',115),
    ->          ('Suzi','2010-12-02',502),
    ->          ('Henry','2010-12-01',197),
    ->          ('Ben','2010-12-02',79),
    ->          ('Henry','2010-11-30',203)
    -> ;
Query OK, 10 rows affected (0.00 sec)
Records: 10  Duplicates: 0  Warnings: 0

mysql>
mysql>
mysql> SELECT @total := SUM(miles) AS 'total miles' FROM mytable;
+-------------+
| total miles |
+-------------+
|        2166 |
+-------------+
1 row in set (0.00 sec)

mysql>
mysql>
mysql> SELECT name,
    -> SUM(miles) AS 'miles/driver',
    -> (SUM(miles)*100)/@total AS 'percent of total miles'
    -> FROM mytable GROUP BY name;
+-------+--------------+------------------------+
| name  | miles/driver | percent of total miles |
+-------+--------------+------------------------+
| Ben   |          362 |                16.7128 |
| Henry |          911 |                42.0591 |
| Suzi  |          893 |                41.2281 |
+-------+--------------+------------------------+
3 rows in set (0.00 sec)

mysql>
mysql> drop table mytable;
Query OK, 0 rows affected (0.00 sec)
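The two-statement approach above works, but note that assigning user variables inside a SELECT (the `@total :=` form) is deprecated in recent MySQL releases. As an alternative sketch, the same result can be produced in a single statement by computing the overall total with a scalar subquery:

```sql
-- Single-statement form: the scalar subquery yields the grand total
-- (2166 for the sample data), so no user variable is needed.
SELECT name,
       SUM(miles) AS 'miles/driver',
       (SUM(miles) * 100) / (SELECT SUM(miles) FROM mytable)
           AS 'percent of total miles'
FROM mytable
GROUP BY name;
```

Because the subquery references only `mytable` as a whole, MySQL evaluates it once, and the percentages match those produced by the user-variable version.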

mysql>