# The Adaptive Complexity of Maximizing a Gross Substitutes Valuation

NeurIPS 2020

Abstract

In this paper, we study the adaptive complexity of maximizing a monotone gross substitutes function under a cardinality constraint. Our main result is an algorithm that achieves a 1 − ε approximation in O(log n) adaptive rounds for any constant ε > 0, which is an exponential speedup in parallel running time compared to previously studied algorithms.

Introduction

- The authors study the problem of maximizing gross substitutes functions in the adaptive complexity model.
- It is well known that a greedy algorithm that iteratively selects the element with the maximal marginal contribution to its current solution obtains a 1 − 1/e approximation for maximization under a cardinality constraint [30] and that this bound is optimal for polynomial-time algorithms [29, 19].
- The authors first show that the number of rounds needed to find a solution that is arbitrarily close to optimal for maximizing monotone gross substitutes under a cardinality constraint is O(log n).
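The iterative rule described above can be sketched in a few lines. This is the textbook greedy, not the paper's parallel algorithm, and the coverage function below is an illustrative stand-in for a monotone valuation:

```python
def greedy(f, ground_set, k):
    """Repeatedly add the element with the largest marginal contribution
    f(S + [x]) - f(S) until the cardinality constraint k is met."""
    S = []
    for _ in range(k):
        best = max((x for x in ground_set if x not in S),
                   key=lambda x: f(S + [x]) - f(S))
        S.append(best)
    return S

# Illustrative monotone submodular valuation: coverage of keyword sets.
coverage = {0: {1, 2}, 1: {2, 3}, 2: {4}}
f = lambda S: len(set().union(*(coverage[x] for x in S)))
print(greedy(f, list(coverage), 2))  # picks 0 (covers {1, 2}), then 1 (adds 3)
```

Each of the k rounds depends on the solution built so far, which is exactly why greedy needs k adaptive rounds and motivates the low-adaptivity algorithms studied here.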

Highlights

- In this paper, we study the problem of maximizing gross substitutes functions in the adaptive complexity model
- The concept of gross substitutes was first introduced in the seminal work by Arrow and Debreu as a sufficient condition on the valuation functions of buyers to guarantee the existence of equilibria in markets with indivisible items [1]
- GSAS was able to obtain high value in far fewer rounds than the traditional GREEDY, as can be seen in the performance gap between GROSS SUBSTITUTES ADAPTIVE SEQUENCING (GSAS) and TRIMMED GREEDY
- We found that for smaller k, GSAS needed as many rounds as GREEDY to terminate, so the performance of the two algorithms is nearly equivalent for k below 80
- For larger values of k, GSAS terminated in far fewer rounds (Figures 1e, 1f, 1g, 1h)
- While previous work has studied maximizing submodular functions, a superclass of gross substitutes, little is known about the adaptive complexity needed to achieve optimal results for this particular class of functions

Results

- The authors describe an algorithm for maximizing gross substitutes functions which has low adaptivity and returns a solution whose approximation guarantee is arbitrarily close to optimal.
- The authors show that the stochastic greedy algorithm can guarantee a strong approximation to the optimal solution of gross substitutes functions.
- The authors describe an algorithm for maximizing gross substitutes functions, GROSS SUBSTITUTES ADAPTIVE SEQUENCING (GSAS), which has O(log n) rounds and returns a solution whose approximation guarantee is arbitrarily close to optimal.
- Since gross substitutes functions are submodular, adaptive sampling provides a 1 − 1/e approximation but fails to give near-optimal guarantees.
- For any monotone gross substitutes function f and ε > 0, GSAS is an O(log(n)/ε³) adaptive algorithm that returns a set S such that E[f(S)] ≥ (1 − O(ε)) · OPT.
- This lower bound shows a sharp separation between gross substitutes and additive and unit demand functions, which can be approximated to within a factor arbitrarily close to 1 in just one round.
- The authors show that there is no o(log n) adaptive algorithm that obtains a constant approximation for maximizing OXS functions when the queries are of size O(k).
- There is no o(log n) adaptive algorithm that obtains better than a 1/log n approximation for maximizing monotone gross substitutes functions under a cardinality constraint when the queries are sets of size O(k).
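The stochastic greedy result above follows the standard sample-then-select template: each round evaluates marginal contributions only on a small random sample rather than the whole ground set. A minimal sketch of that template (function names, defaults, and the (n/k)·ln(1/ε) sample size follow the usual formulation, not necessarily the paper's exact implementation):

```python
import math
import random

def stochastic_greedy(f, ground_set, k, eps=0.1, seed=0):
    """Per round, evaluate marginal contributions only on a random sample
    of roughly (n / k) * ln(1 / eps) elements instead of the full ground set."""
    rng = random.Random(seed)
    n = len(ground_set)
    s = max(1, min(n, math.ceil((n / k) * math.log(1 / eps))))  # sample size
    S = []
    for _ in range(k):
        candidates = [x for x in ground_set if x not in S]
        sample = rng.sample(candidates, min(s, len(candidates)))
        S.append(max(sample, key=lambda x: f(S + [x]) - f(S)))
    return S

# With a small eps the sample covers everything and the run matches greedy:
w = [5.0, 1.0, 9.0, 3.0]
f = lambda S: sum(w[x] for x in S)  # additive, hence gross substitutes
print(stochastic_greedy(f, range(4), 2, eps=0.01))  # → [2, 0]
```

The sampling trims the number of function evaluations per round; the adaptive-round count is still k, which is why it complements rather than replaces the O(log n)-round GSAS.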

Conclusion

- Spectrum of UD-additivity: in the two extremes where the OXS valuation is strictly additive or unit-demand (UD), TOP-K performs optimally by selecting the elements with the highest marginal contribution to the empty set in one round.
- While previous work has studied maximizing submodular functions, a superclass of gross substitutes, little is known about the adaptive complexity needed to achieve optimal results for this particular class of functions.
- The authors' results show an exponentially faster algorithm with near-optimal approximation guarantees for optimization of gross substitute valuations, which have numerous applications in microeconomics and market design [2, 33, 3, 23, 25] and appear in multiple fields such as discrete mathematics [28] and number theory [14].
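The one-round TOP-K rule mentioned above admits a direct sketch (names are illustrative; TOP-K simply ranks singleton values f({x}) in a single non-adaptive round):

```python
def top_k(f, ground_set, k):
    """Single non-adaptive round: rank every element by its marginal
    contribution to the empty set, f({x}), and keep the k largest."""
    return sorted(ground_set, key=lambda x: f([x]), reverse=True)[:k]

# Optimal for an additive valuation (one extreme of the UD-additivity spectrum):
w = {"a": 4.0, "b": 1.5, "c": 3.0}
f = lambda S: sum(w[x] for x in S)
print(top_k(f, list(w), 2))  # → ['a', 'c']
```

For general OXS valuations between the two extremes, this single round is no longer enough, which is what the Ω(log n) lower bound formalizes.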


Funding

- Ron Kupfer - This project has received funding from the European Research Council (ERC) under the European Union’s Horizon 2020 research and innovation program (grant agreement No 740282)
- Yaron Singer - This research was supported by BSF grant 2014389, NSF grant CAREER CCF1452961, NSF USICCS proposal 1540428, Google research award, and a Facebook research award

Study subjects and analysis

Tweets: roughly 500 per hashtag

We filter Twitter data for specific hashtags and extract keywords from each tweet. For each hashtag, we use roughly 500 tweets to construct a bipartite graph with "players" representing advertisements and "items" representing keywords. The valuation of a keyword is determined by the length of the tweet and the popularity of the keyword.
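To make the bipartite construction concrete: an OXS valuation assigns a set of items the value of a maximum-weight matching between players and those items. A brute-force sketch (the weights and sizes below are illustrative, not the paper's data):

```python
from itertools import permutations

def oxs_value(weights, items):
    """OXS value of an item set: the maximum weight of a bipartite matching
    between players (rows of `weights`) and the chosen items, each side
    matched at most once. Brute force over injective player assignments;
    only suitable for tiny instances with at least as many players as items."""
    items = list(items)
    best = 0.0
    for assign in permutations(range(len(weights)), len(items)):
        best = max(best, sum(weights[p][i] for p, i in zip(assign, items)))
    return best

# Two advertisers ("players") and three keywords ("items"):
W = [[3.0, 1.0, 0.0],
     [2.0, 4.0, 1.0]]
print(oxs_value(W, [0, 1]))  # → 7.0: player 0 takes keyword 0, player 1 takes keyword 1
```

Real instances would replace the brute force with a polynomial-time matching routine; the point here is only the valuation's definition as a matching value.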

References

- Kenneth J Arrow and Gerard Debreu. Existence of an equilibrium for a competitive economy. Econometrica: Journal of the Econometric Society, pages 265–290, 1954.
- John William Hatfield and Paul R. Milgrom. Matching with contracts. American Economic Review, 95(4):913–935, September 2005.
- Lawrence M Ausubel and Paul R Milgrom. Ascending auctions with package bidding. Advances in Theoretical Economics, 1(1), 2002.
- Maria Florina Balcan, Florin Constantin, Satoru Iwata, and Lei Wang. Learning valuation functions. arXiv preprint arXiv:1108.5669, 2011.
- Eric Balkanski, Adam Breuer, and Yaron Singer. Non-monotone submodular maximization in exponentially fewer iterations. NeurIPS, 2018.
- Eric Balkanski, Aviad Rubinstein, and Yaron Singer. An exponential speedup in parallel running time for submodular maximization without loss in approximation. SODA, 2019.
- Eric Balkanski, Aviad Rubinstein, and Yaron Singer. An optimal approximation for submodular maximization under a matroid constraint in the adaptive complexity model. STOC, 2019.
- Eric Balkanski and Yaron Singer. The adaptive complexity of maximizing a submodular function. In Proceedings of the 50th Annual ACM SIGACT Symposium on Theory of Computing, pages 1138–1151. ACM, 2018.
- Eric Balkanski and Yaron Singer. Approximation guarantees for adaptive sampling. In International Conference on Machine Learning, pages 393–402, 2018.
- Chandra Chekuri and Kent Quanrud. Parallelizing greedy for submodular set function maximization in matroids and beyond. STOC, 2019.
- Chandra Chekuri and Kent Quanrud. Submodular function maximization in parallel via the multilinear relaxation. SODA, 2019.
- Lin Chen, Moran Feldman, and Amin Karbasi. Unconstrained submodular maximization with constant adaptive complexity. STOC, 2019.
- Andreas WM Dress and Werner Terhalle. Well-layered maps – a class of greedily optimizable set functions. Applied Mathematics Letters, 8(5):77–80, 1995.
- Andreas W.M. Dress and Walter Wenzel. Valuated matroids: a new look at the greedy algorithm. Applied Mathematics Letters, 3(2):33–35, 1990.
- Alina Ene and Huy L Nguyen. Submodular maximization with nearly-optimal approximation and adaptivity in nearly-linear time. SODA, 2019.
- Alina Ene, Huy L Nguyen, and Adrian Vladu. Submodular maximization with packing constraints in parallel. STOC, 2019.
- Matthew Fahrbach, Vahab Mirrokni, and Morteza Zadimoghaddam. Non-monotone submodular maximization with nearly optimal adaptivity complexity. SODA, 2019.
- Matthew Fahrbach, Vahab Mirrokni, and Morteza Zadimoghaddam. Submodular maximization with optimal approximation, adaptivity and query complexity. SODA, 2019.
- Uriel Feige. A threshold of ln n for approximating set cover. Journal of the ACM (JACM), 45(4):634–652, 1998.
- Faruk Gul and Ennio Stacchetti. Walrasian equilibrium with gross substitutes. Journal of Economic Theory, 87(1):95–124, July 1999.
- Avinatan Hassidim and Yaron Singer. Submodular optimization under noise. arXiv preprint arXiv:1601.03095, 2016.
- Avinatan Hassidim and Yaron Singer. Robust guarantees of stochastic greedy algorithms. In Proceedings of the 34th International Conference on Machine Learning, pages 1424–1432. JMLR.org, 2017.
- John William Hatfield, Scott Duke Kominers, Alexandru Nichifor, Michael Ostrovsky, and Alexander Westkamp. Stability and competitive equilibrium in trading networks. Journal of Political Economy, 121(5):966–1005, 2013.
- Thibaut Horel and Yaron Singer. Maximization of approximately submodular functions. In Advances in Neural Information Processing Systems, pages 3045–3053, 2016.
- Yoshiko T Ikebe, Yosuke Sekiguchi, Akiyoshi Shioura, and Akihisa Tamura. Stability and competitive equilibria in multi-unit trading networks with discrete concave utility functions. Japan Journal of Industrial and Applied Mathematics, 32(2):373–410, 2015.
- Ehsan Kazemi, Marko Mitrovic, Morteza Zadimoghaddam, Silvio Lattanzi, and Amin Karbasi. Submodular streaming in all its glory: Tight approximation, minimum memory and low adaptive complexity. ICML, 2019.
- Benny Lehmann, Daniel Lehmann, and Noam Nisan. Combinatorial auctions with decreasing marginal utilities. In Proceedings of the 3rd ACM conference on Electronic Commerce, pages 18–28. ACM, 2001.
- Kazuo Murota and Akiyoshi Shioura. M-convex function on generalized polymatroid. Mathematics of Operations Research, 24(1):95–105, 1999.
- George L Nemhauser and Laurence A Wolsey. Best algorithms for approximating the maximum of a submodular set function. Mathematics of operations research, 3(3):177–188, 1978.
- George L Nemhauser, Laurence A Wolsey, and Marshall L Fisher. An analysis of approximations for maximizing submodular set functions – I. Mathematical Programming, 14(1):265–294, 1978.
- Renato Paes Leme. Gross substitutability: An algorithmic survey. Games and Economic Behavior, 106:294–316, 2017.
- Chao Qian, Jing-Cheng Shi, Yang Yu, Ke Tang, and Zhi-Hua Zhou. Subset selection under noise. In Advances in Neural Information Processing Systems, pages 3560–3570, 2017.
- Alvin E. Roth. Stability and polarization of interests in job matching. Econometrica, 52(1):47–57, 1984.
- Adish Singla, Sebastian Tschiatschek, and Andreas Krause. Noisy submodular maximization via adaptive sampling with applications to crowdsourced image collection summarization. 2016.
