2022-11-29
ELK Logstash Introduction
Logstash is an open source data collection engine with real-time pipelining capabilities. Logstash can dynamically unify data from disparate sources and normalize the data into destinations of your choice. Cleanse and democratize all your data for diverse advanced downstream analytics and visualization use cases.
While Logstash originally drove innovation in log collection, its capabilities extend well beyond that use case. Any type of event can be enriched and transformed with a broad array of input, filter, and output plugins, with many native codecs further simplifying the ingestion process. Logstash accelerates your insights by harnessing a greater volume and variety of data.
The Power of Logstash
The ingestion workhorse for Elasticsearch and more
Horizontally scalable data processing pipeline with strong Elasticsearch and Kibana synergy
Pluggable pipeline architecture
Mix, match, and orchestrate different inputs, filters, and outputs to play in pipeline harmony
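The input → filter → output flow described above maps directly onto a Logstash pipeline configuration. A minimal sketch (the port, pattern, and host are illustrative assumptions, not values from this article):

```conf
# Minimal pipeline: receive events from Beats, structure them, index into Elasticsearch
input {
  beats {
    port => 5044          # illustrative port for Filebeat/Beats traffic
  }
}

filter {
  grok {
    # Parse common Apache access-log lines into named fields
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
}

output {
  elasticsearch {
    hosts => ["localhost:9200"]   # illustrative Elasticsearch endpoint
  }
}
```

Swapping any of the three stages for a different plugin, or stacking several plugins in one stage, is how the "mix, match, and orchestrate" idea plays out in practice.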
Community-extensible and developer-friendly plugin ecosystem
Over 200 plugins available, plus the flexibility of creating and contributing your own
Logstash Loves Data
Collect more, so you can know more. Logstash welcomes data of all shapes and sizes.
Logs and Metrics
Where it all started.
Handle all types of logging data
- Easily ingest a multitude of web logs like Apache, and application logs like log4j for Java
- Capture many other log formats like syslog, networking and firewall logs, and more
- Enjoy complementary secure log forwarding capabilities with Filebeat
- Collect metrics from Ganglia, collectd, NetFlow, JMX, and many other infrastructure and application platforms over TCP and UDP
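As a sketch of the logging inputs listed above, a pipeline can tail log files and listen for syslog traffic at the same time (the file path and port are illustrative assumptions):

```conf
input {
  # Tail Apache access logs directly from disk
  file {
    path => "/var/log/apache2/access.log"   # illustrative path
    start_position => "beginning"
  }
  # Receive RFC 3164 syslog messages over TCP and UDP
  syslog {
    port => 5514                            # illustrative port
  }
}
```

In production, forwarding files with Filebeat to a `beats` input is often preferred over the `file` input, since Beats handles backpressure and secure transport.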
The Web
Unlock the World Wide Web.
Transform HTTP requests into events
- Consume from web service firehoses like Twitter for social sentiment analysis
- Webhook support for GitHub, HipChat, JIRA, and countless other applications
- Enables many Watcher alerting use cases

Create events by polling HTTP endpoints on demand
- Universally capture health, performance, metrics, and other types of data from web application interfaces
- Perfect for scenarios where the control of polling is preferred over receiving
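Polling an HTTP endpoint on a schedule, as described above, is typically done with the `http_poller` input. A sketch (the URL and interval are illustrative assumptions):

```conf
input {
  http_poller {
    # Poll a hypothetical health endpoint every 30 seconds
    urls => {
      app_health => "http://localhost:8080/health"   # illustrative URL
    }
    schedule => { every => "30s" }
    codec => "json"    # decode the JSON response body into event fields
  }
}
```

This inverts the webhook model: Logstash controls when data is fetched, which suits health checks and metrics scraping.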
Data Stores and Streams
Discover more value from the data you already own.
- Better understand your data from any relational database or NoSQL store with a JDBC interface
- Unify diverse data streams from messaging queues like Apache Kafka, RabbitMQ, and Amazon SQS
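Both patterns above have dedicated input plugins. A sketch combining a scheduled JDBC query with a Kafka consumer (the driver path, connection string, table, topic, and broker address are all illustrative assumptions):

```conf
input {
  # Incrementally pull rows changed since the last run
  jdbc {
    jdbc_driver_library => "/path/to/postgresql.jar"        # illustrative path
    jdbc_driver_class => "org.postgresql.Driver"
    jdbc_connection_string => "jdbc:postgresql://localhost:5432/mydb"
    jdbc_user => "reader"
    statement => "SELECT * FROM orders WHERE updated_at > :sql_last_value"
    schedule => "*/5 * * * *"   # cron syntax: every five minutes
  }
  # Consume a stream of events from a Kafka topic
  kafka {
    bootstrap_servers => "localhost:9092"   # illustrative broker
    topics => ["events"]                    # illustrative topic
  }
}
```

The `:sql_last_value` placeholder lets the JDBC input track its position between runs, so only new or updated rows are ingested.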
Sensors and IoT
Explore an expansive breadth of other data.
In this age of technological advancement, the massive IoT world unleashes endless use cases through capturing and harnessing data from connected sensors. Logstash is the common event collection backbone for ingestion of data shipped from mobile devices to intelligent homes, connected vehicles, healthcare sensors, and many other industry specific applications.
Easily Enrich Everything
The better the data, the better the knowledge. Clean and transform your data during ingestion to gain near real-time insights immediately at index or output time. Logstash comes out-of-box with many aggregations and mutations along with pattern matching, geo mapping, and dynamic lookup capabilities.
Grok is the bread and butter of Logstash filters and is used ubiquitously to derive structure out of unstructured data. Enjoy a wealth of integrated patterns aimed to help quickly resolve web, systems, networking, and other types of event formats.

Expand your horizons by deciphering geo coordinates from IP addresses, normalizing date complexity, simplifying key-value pairs and CSV data, fingerprinting (anonymizing) sensitive information, and further enriching your data with local lookups or Elasticsearch queries.

Codecs are often used to ease the processing of common event structures like JSON and multiline events.
See Transforming Data for an overview of some of the popular data processing plugins.
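Several of the enrichment steps named above can be chained in one filter block. A sketch over an Apache access-log event (the field names follow the `COMBINEDAPACHELOG` grok pattern; exact option names can vary across Logstash versions):

```conf
filter {
  # 1. Derive structure from the raw log line
  grok {
    match => { "message" => "%{COMBINEDAPACHELOG}" }
  }
  # 2. Normalize the parsed timestamp into @timestamp
  date {
    match => [ "timestamp", "dd/MMM/yyyy:HH:mm:ss Z" ]
  }
  # 3. Resolve geo coordinates from the client IP
  geoip {
    source => "clientip"
  }
  # 4. Anonymize the client IP via a one-way hash
  fingerprint {
    source => "clientip"
    method => "SHA256"
  }
}
```

Each filter runs in order, so later steps (like `geoip`) depend on fields that `grok` extracted earlier.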