
Back-to-the-Future Whois: an IP Address Attribution Service for Working with Historic Datasets.

PASSIVE AND ACTIVE MEASUREMENT, PAM 2023 (2023)

Max Planck Inst Informat | TU Wien | Delft Univ Technol

Cited 4 | Views 35
Abstract
Researchers and practitioners often face the issue of having to attribute an IP address to an organization. For current data this is comparatively easy, using services like whois or other databases. Similarly, for historic data, several entities like the RIPE NCC provide websites that give access to historic records. For large-scale network measurement work, though, researchers often have to attribute millions of addresses. For current data, Team Cymru provides a bulk whois service which allows bulk address attribution. However, at the time of writing, there is no service available that allows historic bulk attribution of IP addresses. Hence, in this paper, we introduce and evaluate our 'Back-to-the-Future whois' service, allowing historic bulk attribution of IP addresses at a daily granularity based on CAIDA Routeviews aggregates. We provide this service to the community for free, and also share our implementation so researchers can run instances themselves.
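The core operation the abstract describes, attributing an IP address to an origin AS from a dated CAIDA Routeviews aggregate, amounts to a longest-prefix match over a prefix-to-AS table. A minimal sketch of that lookup is shown below; the sample data and function names are illustrative (the real pfx2as files are per-day, tab-separated prefix/length/ASN dumps, which this assumes), not the paper's actual implementation:

```python
import ipaddress

# Hypothetical excerpt in CAIDA Routeviews pfx2as format for one day:
# tab-separated prefix, prefix length, origin ASN.
PFX2AS_SAMPLE = """\
8.8.8.0\t24\t15169
1.1.1.0\t24\t13335
192.0.2.0\t24\t64496
"""

def load_pfx2as(text):
    """Parse pfx2as lines into (network, asn) pairs, most-specific first."""
    table = []
    for line in text.strip().splitlines():
        prefix, length, asn = line.split("\t")
        table.append((ipaddress.ip_network(f"{prefix}/{length}"), asn))
    # Sort by prefix length descending so the first hit is the longest match.
    table.sort(key=lambda entry: entry[0].prefixlen, reverse=True)
    return table

def attribute(ip, table):
    """Return the origin ASN for `ip` via longest-prefix match, or None."""
    addr = ipaddress.ip_address(ip)
    for net, asn in table:
        if addr in net:
            return asn
    return None

table = load_pfx2as(PFX2AS_SAMPLE)
print(attribute("8.8.8.8", table))  # -> 15169
```

A production service would replace the linear scan with a radix trie and select the pfx2as snapshot matching the requested date, which is what makes bulk historic attribution tractable at scale.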
Key words
Named Data Networking
Related Papers

Lowering the Barriers to Working with Public RIR-Level Data.

PROCEEDINGS OF THE 2023 APPLIED NETWORKING RESEARCH WORKSHOP, ANRW 2023 (2023)

Cited 0

Crisis, Ethics, Reliability & a Measurement.network: Reflections on Active Network Measurements in Academia.

PROCEEDINGS OF THE 2023 APPLIED NETWORKING RESEARCH WORKSHOP, ANRW 2023 (2023)

Cited 0

Domain and Website Attribution Beyond WHOIS

39TH ANNUAL COMPUTER SECURITY APPLICATIONS CONFERENCE, ACSAC 2023 (2023)

Cited 0

Summary

Key points: This paper introduces the "Back-to-the-Future Whois" service, a novel IP address attribution service enabling bulk attribution of IP addresses for historic datasets.

Methods: The service attributes historic IP addresses at a daily granularity, based on CAIDA Routeviews aggregate data.

Experiments: The authors evaluate the service and provide it for free, together with the implementation code, so researchers can run their own instances; the experiments use CAIDA Routeviews datasets.