Please use this identifier to cite or link to this item:
https://repository.iimb.ac.in/handle/2074/9777
DC Field | Value | Language |
---|---|---|
dc.contributor.advisor | Basu, Arnab | |
dc.contributor.author | Mohapatra, Prakash | |
dc.date.accessioned | 2019-07-23T09:02:31Z | - |
dc.date.available | 2019-07-23T09:02:31Z | - |
dc.date.issued | 2013 | |
dc.identifier.uri | http://repository.iimb.ac.in/handle/2074/9777 | |
dc.description.abstract | In this paper, we make a three-fold contribution to the domain of reinforcement learning of equilibria in the framework of nonzero-sum stochastic dynamic games. First, we extend the techniques of Q( )-learning to the multi-player setup. We also extend the idea of a polynomial learning rate to this domain for faster convergence. Most importantly, we propose a novel nonlinear learning algorithm which eliminates the learning starvation typical of linear learning algorithms such as Q( )-learning. Prior work in the reinforcement learning domain is mainly restricted to linear techniques, which lead to learning starvation. Our learning objective is the infinite-horizon discounted pay-off criterion, which is used to estimate the long-term market equilibria. We have applied this model to a real-life business case to analyze the competition between ARM and Intel in the smartphone microprocessor market. The model is restricted to a duopoly; however, the work can be easily extended to the more general case. We have estimated the market equilibrium payoffs for this set-up and proposed some business insights based on our findings. | |
dc.language.iso | en_US | |
dc.publisher | Indian Institute of Management Bangalore | |
dc.relation.ispartofseries | EPGP_P13_09 | |
dc.subject | Marketing management | |
dc.title | Nonlinear reinforcement learning of dynamic Nash equilibrium | |
dc.type | Project Report-EPGP | |
dc.pages | 50p. | |
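The abstract describes extending tabular Q-learning to a two-player (duopoly) setting with a polynomial learning rate for faster convergence. The following is a minimal illustrative sketch of those two ideas only, not the report's actual algorithm; all names and values (state/action sizes, `omega`, `gamma`, the toy random dynamics) are assumptions for illustration.

```python
# Illustrative sketch: two-player tabular Q-learning with a polynomial
# learning rate alpha_t = 1/t**omega. The transition and payoff dynamics
# here are random placeholders, not the ARM/Intel market model.
import numpy as np

n_states, n_actions = 4, 2   # toy duopoly state/action spaces (assumed)
gamma = 0.9                  # discount factor for the infinite-horizon payoff
omega = 0.6                  # polynomial rate exponent, typically in (1/2, 1]

rng = np.random.default_rng(0)
# one Q-table per player, indexed by (state, action of player 0, action of player 1)
Q = [np.zeros((n_states, n_actions, n_actions)) for _ in range(2)]

def polynomial_alpha(t, omega=omega):
    """Polynomial learning rate: decays as 1/t**omega (vs. linear 1/t)."""
    return 1.0 / (t ** omega)

s = 0
for t in range(1, 1001):
    a = [rng.integers(n_actions) for _ in range(2)]   # random exploration
    s_next = rng.integers(n_states)                   # toy random transition
    r = [rng.normal(), rng.normal()]                  # toy stochastic payoffs
    alpha = polynomial_alpha(t)
    for i in range(2):
        # each player bootstraps from its own best value at the next state
        target = r[i] + gamma * Q[i][s_next].max()
        Q[i][s, a[0], a[1]] += alpha * (target - Q[i][s, a[0], a[1]])
    s = s_next
```

A polynomial rate with exponent below 1 shrinks more slowly than the classical 1/t schedule, which is the intuition behind the faster-convergence claim in the abstract.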
Appears in Collections: | 2010-2015 |
Files in This Item:
File | Size | Format | |
---|---|---|---|
EPGP_P13_1214046.pdf | 433.47 kB | Adobe PDF |