====== Programmatic Advertising A.Y. 2019/20 ======
  
When you submit a request to Google (or another search engine), you fire an online auction: in much less than a second, many advertisers' algorithms submit their bids and a Google algorithm chooses the winners, which gain the right to show you their advertisements. Similar events happen when you enter a web site, a social network, or an app that delivers advertisements to you.\\ 
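
The auction mechanism sketched above can be illustrated in a few lines of code. The snippet below is a minimal sketch, assuming a sealed-bid second-price auction for a single ad impression; the advertiser names, bid values, and the pricing rule itself are illustrative assumptions, as real ad exchanges apply their own, more elaborate rules.

<code python>
import random

def run_auction(bids):
    """bids maps an advertiser name to its bid (EUR) for one impression.

    Returns the winner and the price paid under a second-price rule:
    the highest bidder wins but pays the second-highest bid.
    """
    ranked = sorted(bids.items(), key=lambda item: item[1], reverse=True)
    winner, top_bid = ranked[0]
    price = ranked[1][1] if len(ranked) > 1 else top_bid
    return winner, price

# Hypothetical advertisers; in practice each bid comes from a bidding algorithm.
bids = {name: round(random.uniform(0.10, 2.00), 2)
        for name in ("adv_A", "adv_B", "adv_C")}
winner, price = run_auction(bids)
print(f"bids={bids} -> winner={winner}, pays {price} EUR")
</code>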
No advanced background in mathematics or computer science is required.
  
=====Contents=====

  * Digital advertising and e-commerce: business context; programmatic vs. traditional approach; the profession of programmatic advertising; the economy of online auctions.
  * Decision-making: expected utility, optimization, heuristics.
  * Multi-armed bandits: a conceptual framework for maximizing profit in a dynamic uncertain environment (see the sketch after this list).
  * Basic statistical tools: distributions (like Normal, Binomial, Beta); concept of simulation; Bayesian inference.
  * Forecasting: regression; time series analysis; matrix-based methods.
  * Advertising and e-commerce with customer profiles available.

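As an illustration of the multi-armed-bandit topic listed above, the sketch below runs a simple epsilon-greedy strategy on simulated ads. The ad names, click-through rates, and the choice of epsilon-greedy are illustrative assumptions, not material taken from the course.

<code python>
import random

true_ctr = {"ad_1": 0.02, "ad_2": 0.05, "ad_3": 0.03}  # unknown to the advertiser
shows = {ad: 0 for ad in true_ctr}
clicks = {ad: 0 for ad in true_ctr}
epsilon = 0.1  # fraction of impressions spent exploring

def estimate(ad):
    # Optimistic estimate: untried ads look good, so each gets tried at least once.
    return clicks[ad] / shows[ad] if shows[ad] else 1.0

for _ in range(10_000):
    if random.random() < epsilon:
        ad = random.choice(list(true_ctr))        # explore a random ad
    else:
        ad = max(true_ctr, key=estimate)          # exploit the best estimate so far
    shows[ad] += 1
    clicks[ad] += random.random() < true_ctr[ad]  # simulate whether the user clicks

for ad in true_ctr:
    print(ad, "shown", shows[ad], "times, estimated CTR", round(estimate(ad), 4))
</code>
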
=====Instructor=====

  * **Nicola Ciaramella**
    * [[sidunipisa@gmail.com]]

=====Teaching Material=====

  * {{ :mds:pa:2.1_single_resource_capacity_allocation_-_part_1.pdf |2.1_single_resource_capacity_allocation_-_part_1.pdf}}
  * {{ :mds:pa:2.2_single_resource_capacity_allocation_-_part_2.pdf |2.2_single_resource_capacity_allocation_-_part_2.pdf}}
  * {{ :mds:pa:3.3._logistic_regression_example_phone_service_.pdf |3.3._logistic_regression_example_phone_service_.pdf}}
  * {{ :mds:pa:bayes_in_medicine.pdf |bayes_in_medicine.pdf}}
  * {{ :mds:pa:bayes_rule_slides_.pdf |bayes_rule_slides_.pdf}}
  * {{ :mds:pa:decision_theory_-_airplane_example.ppt |decision_theory_-_airplane_example.ppt}}
  * {{ :mds:pa:decision_theory_-_used_car_example.pdf |decision_theory_-_used_car_example.pdf}}
  * {{ :mds:pa:dynamic_pricing_-_part_1.pdf |dynamic_pricing_-_part_1.pdf}}
  * {{ :mds:pa:dynamic_pricing_-_part_2.pdf |dynamic_pricing_-_part_2.pdf}}
  * {{ :mds:pa:logistic_regression_-_part_1.pdf |logistic_regression_-_part_1.pdf}}
  * {{ :mds:pa:logistic_regression_-_part_2.pdf |logistic_regression_-_part_2.pdf}}
  * {{ :mds:pa:multi_armed_bandits_-_part_1.pdf |multi_armed_bandits_-_part_1.pdf}}
  * {{ :mds:pa:multi_armed_bandits_-_part_2.pdf |multi_armed_bandits_-_part_2.pdf}}
  * {{ :mds:pa:multi_armed_bandits_-_part_3.pdf |multi_armed_bandits_-_part_3.pdf}}
  * {{ :mds:pa:multiarmed_bandits_word_-_part_1.pdf |multiarmed_bandits_word_-_part_1.pdf}}
  * {{ :mds:pa:naive_bayes_ctr_prediction.pdf |naive_bayes_ctr_prediction.pdf}}
  * {{ :mds:pa:naive_bayesian_prediction_-_example.pdf |naive_bayesian_prediction_-_example.pdf}}
  * {{ :mds:pa:probability_distributions_-_part_1.pdf |probability_distributions_-_part_1.pdf}}
  * {{ :mds:pa:probability_distributions_-_part_2.pdf |probability_distributions_-_part_2.pdf}}
  * {{ :mds:pa:probability_distributions_-_part_3.pdf |probability_distributions_-_part_3.pdf}}
  * {{ :mds:pa:probability_distributions_with_excel_-_example.xlsx |probability_distributions_with_excel_-_example.xlsx}}
  * {{ :mds:pa:regression.pdf |regression.pdf}}
  * {{ :mds:pa:trendlines.pdf |trendlines.pdf}}
  * {{ :mds:pa:trendlines_examples.xlsx |trendlines_examples.xlsx}}