This is an automated email from the ASF dual-hosted git repository.

sivabalan pushed a commit to branch asf-site
in repository https://gitbox.apache.org/repos/asf/hudi.git


The following commit(s) were added to refs/heads/asf-site by this push:
     new aa95b5c  [MINOR] Updating powered-by page (#4727)
aa95b5c is described below

commit aa95b5cdab5731231dc1c960a1fee846bdf25f6e
Author: Kyle Weller <[email protected]>
AuthorDate: Mon Jan 31 15:53:26 2022 -0800

    [MINOR] Updating powered-by page (#4727)
---
 ...2022-01-06-apache-hudi-2021-a-year-in-review.md |   2 +-
 website/src/pages/powered-by.md                    | 204 +++++----------------
 website/static/assets/images/powers/logo-wall.png  | Bin 0 -> 450292 bytes
 3 files changed, 45 insertions(+), 161 deletions(-)

diff --git a/website/blog/2022-01-06-apache-hudi-2021-a-year-in-review.md b/website/blog/2022-01-06-apache-hudi-2021-a-year-in-review.md
index c31e3cd..da65662 100644
--- a/website/blog/2022-01-06-apache-hudi-2021-a-year-in-review.md
+++ b/website/blog/2022-01-06-apache-hudi-2021-a-year-in-review.md
@@ -14,7 +14,7 @@ As the year came to end, I took some time to reflect on where we are and what we
 
 I want to call out how amazing it is to see such a diverse group of people step up and contribute to this project. There were over 30,000 interactions with the [project on github](https://github.com/apache/hudi/), up 2x from last year. Over the last year 300 people have contributed to the project, with over 3,000 PRs over 5 releases. We moved Apache Hudi from release 0.5.X all the way to our feature-packed 0.10.0 release. Come and join us on our [active slack channel](https://join.slack. [...]
 
-<img src="/assets/images/Hudi_logos.png" alt="drawing" width="600"/>
+<img src="/assets/images/powers/logo-wall.png" alt="drawing" width="600"/>
 
 **_Impact_**
 
diff --git a/website/src/pages/powered-by.md b/website/src/pages/powered-by.md
index 01acd40..89341aa 100644
--- a/website/src/pages/powered-by.md
+++ b/website/src/pages/powered-by.md
@@ -6,210 +6,94 @@ last_modified_at: 2019-12-31T15:59:57-04:00
 
 # Who's Using
 
-## Adoption
+Apache Hudi is a [fast-growing, diverse community](https://hudi.apache.org/blog/2022/01/06/apache-hudi-2021-a-year-in-review)
+of people and organizations from all around the globe. The following is a small sample of companies that have adopted or
+contributed to the Apache Hudi community! [Join us on slack](https://join.slack.com/t/apache-hudi/shared_invite/enQtODYyNDAxNzc5MTg2LTE5OTBlYmVhYjM0N2ZhOTJjOWM4YzBmMWU2MjZjMGE4NDc5ZDFiOGQ2N2VkYTVkNzU3ZDQ4OTI1NmFmYWQ0NzE),
+or come to one of our [virtual community events](https://hudi.apache.org/community/syncs).
 
-<img src="/assets/images/powers/alibaba.png" alt="drawing"/>
+<img src="/assets/images/powers/logo-wall.png" alt="drawing"/>
 
-<br/>
+### 37 Interactive Entertainment
+[37 Interactive Entertainment](https://www.37wan.net/) is a global Top 20 listed game company and a leading company on China's A-shares market.
+Apache Hudi is integrated into our Data Middle Platform, offering a real-time data warehouse and solving the problem of frequently changing data.
+Meanwhile, we have built a set of data access standards based on Hudi, which guarantees massive data queries in game operation scenarios.
 
-### Alibaba Cloud  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Alibaba Cloud 
 Alibaba Cloud provides cloud computing services to online businesses and Alibaba's own e-commerce ecosystem. Apache Hudi is integrated into Alibaba Cloud [Data Lake Analytics](https://www.alibabacloud.com/help/product/70174.htm), offering real-time analysis on Hudi datasets.
 
-<img src="/assets/images/powers/aws.jpg" alt="drawing" width="100"/>
-<br/>
+### Amazon
+[Amazon Transportation Service](https://aws.amazon.com/blogs/big-data/how-amazon-transportation-service-enabled-near-real-time-event-analytics-at-petabyte-scale-using-aws-glue-with-apache-hudi/)
+uses Apache Hudi for the backbone of their package delivery system, powering petabyte-scale near real-time analytics.
 
-### Amazon Web Services &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;  
+### Amazon Web Services  
 Amazon Web Services is the world's leading cloud services provider. Apache Hudi is [pre-installed](https://aws.amazon.com/emr/features/hudi/) with the AWS Elastic MapReduce offering, providing means for AWS users to perform record-level updates/deletes and manage storage efficiently.
 
-<img src="/assets/images/powers/clinbrain.png" alt="drawing"/>
-<br/>
+### ByteDance
+[ByteDance](https://hudi.apache.org/blog/2021/09/01/building-eb-level-data-lake-using-hudi-at-bytedance/)
+uses Apache Hudi to power their exabyte-scale TikTok #ForYouPage real-time recommendation engine.
 
-### Clinbrain &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Clinbrain
 [Clinbrain](https://www.clinbrain.com/) is a leader in big data platforms for the medical industry. We have built 200 medical big data centers by integrating the Hudi data lake solution in numerous hospitals. Hudi provides the ability to upsert and delete on HDFS; at the same time, it efficiently keeps fresh data streams up-to-date in the Hadoop system with Hudi's incremental view.
 
-<img src="/assets/images/powers/emis.jpg" alt="drawing"/>
-<br/>
+### Disney+ Hotstar
+[Disney shared](https://youtu.be/ZamXiT9aqs8) how they migrated CDC data to Apache Hudi to power a real-time ads platform for their streaming service.
 
-### EMIS Health &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### EMIS Health 
 [EMIS Health](https://www.emishealth.com/) is the largest provider of primary care IT software in the UK, with datasets including more than 500Bn healthcare records. Hudi is used to manage their analytics datasets in production and keep them up-to-date with their upstream sources. Presto is used to query the data written in Hudi format.
 
-<img src="/assets/images/powers/grofers.png" alt="drawing" width="150"/>
-<br/>
+### GE Aviation
+[GE Aviation built cloud-native data pipelines at enterprise scale using Apache Hudi on the AWS platform](https://aws.amazon.com/blogs/big-data/how-ge-aviation-built-cloud-native-data-pipelines-at-enterprise-scale-using-the-aws-platform/)
 
-### Grofers &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Grofers 
 [Grofers](https://grofers.com) is a grocery delivery provider operating across the APAC region. Grofers has [integrated Hudi](https://lambda.grofers.com/origins-of-data-lake-at-grofers-6c011f94b86c) into its central pipelines for replicating backend database CDC into the warehouse.
 
-<img src="/assets/images/powers/H3C.JPG" alt="drawing" width="150"/>
-<br/>
-
-### H3C Digital Platform &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### H3C Digital Platform 
 [H3C digital platform](http://www.h3c.com/) provides end-to-end capabilities for data collection, storage, computation and governance, and enables the construction of data centers and data governance for the medical, smart park, smart city and other industries. Apache Hudi is integrated into the digital platform to meet the real-time update needs of massive data.
 
-<img src="/assets/images/powers/kyligence.png" alt="drawing"/>
-<br/>
+### Halodoc
+[Lake House Architecture at Halodoc: Data Platform 2.0](https://blogs.halodoc.io/lake-house-architecture-halodoc-data-platform-2-0/)
 
-### Kyligence &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Kyligence 
 [Kyligence](https://kyligence.io/zh/) is the leading big data analytics platform company. We've built end-to-end solutions for various Global Fortune 500 companies in the US and China. We adopted Apache Hudi in our cloud solution on AWS in 2019. With the help of Hudi, we are able to process upserts and deletes easily, and we use incremental views to build efficient data pipelines in AWS. Hudi datasets can also be integrated into Kyligence Cloud directly for highly concurrent OLAP access.
 
-<img src="/assets/images/powers/lingyue.png" alt="drawing" width="100"/>
-<br/>
-
-### Lingyue-digital Corporation &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Lingyue-digital Corporation 
 [Lingyue-digital Corporation](https://www.lingyue-digital.com/) belongs to the BMW Group. Apache Hudi is used to ingest MySQL and PostgreSQL change data capture. We build upsert scenarios on Hadoop and Spark.
 
-<img src="/assets/images/powers/hopsworks.png" alt="drawing" width="200"/>
-<br/>
-
-### Logical Clocks &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Logical Clocks 
 The [Hopsworks 1.x series](https://www.logicalclocks.com/blog/introducing-the-hopsworks-1-x-series) supports Apache Hudi feature groups, to enable upserts and time travel.
 
-<img src="/assets/images/powers/shunfeng.png" alt="drawing" width="200"/>
-<br/>
-
-### SF-Express &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Robinhood
+[RDS data lake at Robinhood using Apache Hudi](https://www.slideshare.net/BalajiVaradarajan13/rds-data-lake-robinhood)
 
+### SF-Express 
 [SF-Express](https://www.sf-express.com/cn/sc/) is the leading logistics service provider in China. Hudi is used to build a real-time data warehouse, providing real-time computing solutions with higher efficiency and lower cost for our business.
 
-<img src="/assets/images/powers/tathastu.png" alt="drawing" width="150"/>
-<br/>
-
-### Tathastu.ai &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
+### Tathastu.ai 
 [Tathastu.ai](https://www.tathastu.ai) offers the largest AI/ML playground of consumer data for data scientists, AI experts and technologists to build upon. They have built a CDC pipeline using Apache Hudi and Debezium. Data from Hudi datasets is queried using Hive, Presto and Spark.
 
-<img src="/assets/images/powers/qq.png" alt="drawing" width="100"/>
-<br/>
-
-### Tencent &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
+### Tencent 
 [EMR from Tencent](https://intl.cloud.tencent.com/product/emr) Cloud has integrated Hudi as one of its big data components [since V2.2.0](https://intl.cloud.tencent.com/document/product/1026/35587). Using Hudi, end-users can handle either read-heavy or write-heavy use cases, and Hudi will manage the underlying data stored on HDFS/COS/CHDFS using Apache Parquet and Apache Avro.
 
-<img src="/assets/images/powers/uber.png" alt="drawing" width="100"/>
-<br/>
-
-### Uber &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
+### Uber 
 Apache Hudi was originally developed at [Uber](https://uber.com) to achieve [low-latency database ingestion with high efficiency](http://www.slideshare.net/vinothchandar/hadoop-strata-talk-uber-your-hadoop-has-arrived/32). It has been in production since Aug 2016, powering the massive [100PB data lake](https://eng.uber.com/uber-big-data-platform/), including highly business-critical tables like core trips, riders, and partners. It also powers several incremental Hive ETL pipelines and is currently being integrated into Uber's data dispersal system.
 
-<img src="/assets/images/powers/udemy.png" alt="drawing" width="100"/>
-<br/>
-
-### Udemy &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
+### Udemy 
 At [Udemy](https://www.udemy.com/), Apache Hudi on AWS EMR is used to ingest MySQL change data capture.
 
-<img src="/assets/images/powers/yield.png" alt="drawing" width="100"/>
-<br/>
-
-### Yields.io &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Walmart
+[Walmart](https://searchdatamanagement.techtarget.com/feature/Hudi-powering-data-lake-efforts-at-Walmart-and-Disney-Hotstar)
+chose Apache Hudi to manage their data lake of store transactions.
 
+### Yields.io 
 Yields.io is the first FinTech platform that uses AI for automated model validation and real-time monitoring on an enterprise-wide scale. Their [data lake](https://www.yields.io/Blog/Apache-Hudi-at-Yields) is managed by Hudi. They are also actively building their infrastructure for incremental, cross-language/platform machine learning using Hudi.
 
-<img src="/assets/images/powers/yotpo.png" alt="drawing" width="80"/>
-<br/>
-
-### Yotpo &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
+### Yotpo
 Yotpo uses Hudi in several ways. They integrated Hudi as a writer in their open-source ETL framework, [Metorikku](https://github.com/YotpoLtd/metorikku), and use it as an output writer for a CDC pipeline, where events generated from database binlog streams flow to Kafka and are then written to S3.
 
-<img src="/assets/images/powers/zendesk.png" alt="drawing" width="80"/>
-<br/>
-
-### Zendesk &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
-At [Zendesk](https://www.zendesk.com/), Apache Hudi is adopted for building Data Lake on AWS.
-
- <img src="/assets/images/powers/37.PNG" alt="drawing" width="100"/>
-<br/>
-
-### 37 Interactive Entertainment &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp;
-
-[37 Interactive Entertainment](https://www.37wan.net/) is a global Top20 listed game company, and a leading company on A-shares market of China.
-Apache Hudi is integrated into our Data Middle Platform offering real-time data warehouse and solving the problem of frequent changes of data.
-Meanwhile, we build a set of data access standards based on Hudi, which provides a guarantee for massive data queries in game operation scenarios.
-
-<img src="/assets/images/powers/robinhood.png" alt="drawing" width="200"/>
-<br/>
-
-### Robinhood  &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-[Rds data lake at Robinhood using Apache Hudi](https://www.slideshare.net/BalajiVaradarajan13/rds-data-lake-robinhood)
-
-<img src="/assets/images/powers/bytedance.png" alt="drawing" width="200"/>
-<br/>
-
-### ByteDance &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-[Building an ExaByte-level Data Lake Using Apache Hudi](https://hudi.apache.org/blog/2021/09/01/building-eb-level-data-lake-using-hudi-at-bytedance/)
-
-<img src="/assets/images/powers/amazon.png" alt="drawing" width="150"/>
-<br/>
-
-### Amazon 
-[Amazon Transportation Service enabled near-real-time event analytics at petabyte scale using AWS Glue with Apache Hudi](https://aws.amazon.com/blogs/big-data/how-amazon-transportation-service-enabled-near-real-time-event-analytics-at-petabyte-scale-using-aws-glue-with-apache-hudi/)
-
-<img src="/assets/images/powers/walmart.png" alt="drawing" width="200"/>
-<br/>
-
-### Walmart &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-[Walmart uses Apache Hudi to build their Data Lake](https://searchdatamanagement.techtarget.com/feature/Hudi-powering-data-lake-efforts-at-Walmart-and-Disney-Hotstar)
-
-<img src="/assets/images/powers/disney_hotstar.png" alt="drawing" width="100"/>
-<br/>
-
-### Disney+Hotstar &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-[Migrating CDC data to Apache Hudi](https://youtu.be/ZamXiT9aqs8)
-
-<img src="/assets/images/powers/GE_aviation.png" alt="drawing" width="200"/>
-
-### GE Aviation &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-[GE Aviation built cloud-native data pipelines at enterprise scale using Apache Hudi in AWS platform](https://aws.amazon.com/blogs/big-data/how-ge-aviation-built-cloud-native-data-pipelines-at-enterprise-scale-using-the-aws-platform/)
-
-<img src="/assets/images/powers/halodoc.png" alt="drawing" width="150"/>
-<br/>
-
-### Halodoc &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-[Lake House Architecture at Halodoc: Data Platform 2.0](https://blogs.halodoc.io/lake-house-architecture-halodoc-data-platform-2-0/)
-
-<img src="/assets/images/powers/google_cloud.png" alt="drawing" width="200"/>
-<br/>
-
-### Google Cloud &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
-<img src="/assets/images/powers/moveworks.png" alt="drawing" width="200"/>
-<br/>
-
-### Moveworks &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
-<img src="/assets/images/powers/DoubleVerify.png" alt="drawing" width="200"/>
-<br/>
-
-### Double Verify &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
-<img src="/assets/images/powers/cirium.png" alt="drawing" width="200"/>
-<br/>
-
-### Cirium &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
-<img src="/assets/images/powers/bilibili.png" alt="drawing" width="200"/>
-<br/>
-
-### Bilibili &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
-<img src="/assets/images/powers/ai_bank.png" alt="drawing" width="200"/>
-<br/>
-
-### AI Bank &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
-<img src="/assets/images/powers/huawei.png" alt="drawing" width="150"/>
-<br/>
-
-### Huawei &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
-
-<img src="/assets/images/powers/tongcheng.png" alt="drawing" width="150"/>
-<br/>
-
-### Tongcheng &nbsp;&nbsp;&nbsp;&nbsp;&nbsp;&nbsp; 
+### Zendesk 
+At [Zendesk](https://www.zendesk.com/), Apache Hudi is adopted for building a data lake on AWS.
\ No newline at end of file
diff --git a/website/static/assets/images/powers/logo-wall.png b/website/static/assets/images/powers/logo-wall.png
new file mode 100644
index 0000000..39ff007
Binary files /dev/null and b/website/static/assets/images/powers/logo-wall.png differ
