
2022

Vietnamese Street Food

Today I'm reviewing Vietnamese street food. All of the photos in this post were taken during my trip, and they show the authentic street food I enjoyed along the way.

Vietnam War-Era Food

Inside the Củ Chi tunnels

Vietnam War-era food

I traveled for two weeks with a vegetarian colleague, and because of her dietary restrictions, all of our meals were vegetarian. Honestly, I found the vegetarian dishes a little disappointing. This photo shows Vietnam War-era food: dry, flavorless potatoes. The harsh reality of the time was that people were too hungry to have a choice. An old man told me he even had to eat dog to survive. In circumstances like that, the most important thing is not to judge.

Local Street Food Tour by Motorbike

Motorbike tour

Only on weekends, when my colleague was away, could I indulge in authentic street food, and the most authentic way to enjoy it is by motorbike. Driving in Vietnam is an adventure in itself; the traffic rules are more like suggestions. People say that if you can ride a motorbike in Vietnam, you can drive anywhere in the world. Thankfully, I escaped unscathed.

Bánh Tráng Nướng (Vietnamese Pizza)

Vietnamese pizza

Vietnamese pizza is another popular dish. Made from rice paper, it is light and easy to eat. The authentic version is cooked over an open flame, with complete disregard for safety regulations. The result is simply delightful.

Bánh Xèo (Vietnamese Sizzling Pancake)

Vietnamese sizzling pancake

Next is the Vietnamese sizzling pancake, cooked in a pan over a real fire. It is stuffed with fragrant ingredients such as sweet potato leaves, which fill it with aroma.

Bột Chiên (Fried Rice-Flour Cake with Papaya)

Fried rice-flour cake

Fried rice is a favorite of mine, but this dish was even more enjoyable. It is essentially a fried rice-flour cake, and it is delicious.

Nem Bò Nướng Sả (Smoky Sweet-and-Savory Pork Skewers)

Pork skewers

Pork skewers are also very popular. Pigs may be clever, but they are also very tasty.

Bánh Mì (Vietnamese Baguette)

Vietnamese baguette

France's colonial influence in Vietnam left behind the delicious Vietnamese baguette sandwich, also known as Bánh Mì.

Bún Bò Huế (Beef Noodle Soup from Huế)

CO AN, 001 BIS C/C NGUYEN THIEN THUAT, P.1 Q3, TPHCM

Beef noodle soup from Huế

Bún Bò Huế is a slightly spicy dish; Huế is the name of the city where it originated. My sincere apologies to the vegetarians, but it is simply too good. Most people in Vietnam do not really understand vegetarianism; they will even add fish sauce to a salad.

Bún Cá (Fish Cake Noodles)

Fish cakes

Bún Cá is a delightful fish-cake noodle dish. Add some fish sauce and it becomes even more flavorful.

More

Assorted dishes

There are many more delicious foods waiting for you to try.

Chuối Nếp Nướng (Grilled Banana Sticky Rice)

378 VO VAN TAN, P.5. QUAN 3, TP. HCM

Banana sticky rice

For dessert, I recommend banana sticky rice with coconut. Compared with Thailand's mango sticky rice, it is even better. Watch your pronunciation, though, so you don't mangle the key words.

Snacks

Assorted snacks

An assortment of jelly-like snacks offers a perfect balance of sweet and savory.

Coconut Ice Cream

Coconut ice cream

In this photo you can see two servings of ice cream, even though I was eating alone. The perk of being single? You get both.

Coconut

Coconut

This is an ordinary coconut, in case you have never seen one.

Coconut Coffee

Coconut coffee

My current favorite drink is coconut coffee. It is like a Starbucks iced milk tea, but it tastes better.

And More Vietnamese Coffee

Vietnamese coffee

Of course, Vietnam is famous for its coffee, especially the kind made with condensed milk. It tastes as good as it looks.

Useful Phrases to Learn

Rượu (Alcohol)

The first word is "rượu," which means alcohol. Whether you are happy or sad, "rượu" is the answer to everything.

Một, Hai, Ba, YO (1, 2, 3, Cheers)

The second phrase is "Một, Hai, Ba, YO," meaning "1, 2, 3, cheers." It is easy to remember, so grab your beer and say it with me: "Một, Hai, Ba, YO!"

Product Strategy Recommendations

I recently completed an online course where I learned the frameworks and concepts needed to create compelling and memorable product experiences. I chose to apply this knowledge to Thought Machine, a company that has developed a core banking product. This product is revolutionizing the banking industry by empowering banks to offer innovative services to their customers.

Thought Machine has experienced substantial growth, recently raising $200 million in a Series C funding round. The round included industry-leading VCs and global banks such as Nyca Partners, Molten Ventures, JPMorgan, and Standard Chartered. This brings the company's total funding to $350 million, with a valuation exceeding £1 billion. Thought Machine collaborates with several banks, ranging from Tier 1 institutions to challengers, including Atom Bank, Curve, Lloyds Banking Group, Monese, SEB, Standard Chartered, TransferGo, Arvest, ING, and JP Morgan Chase.

The company's flagship product is Vault, a cloud-native ledger platform that operates on various cloud services, including Amazon Web Services, Google Cloud Platform, and Microsoft Azure. Thanks to smart contracts, Vault offers unparalleled flexibility, allowing it to run various retail banking products like current accounts, savings, loans, credit cards, and mortgages.

Levels of the Product

The best products offer more than just functionality; they meet a psychological need. Here's a breakdown:

Augmented Product

  • Real-time data transmission in and out of the ledger enables rich data streaming for AI and reporting.
  • Banks can manage everything from international currencies to cryptocurrencies and reward points.
  • A payments processing platform that allows any bank to manage payments across all methods, schemes, and regions worldwide.

Actual Product

  • A modern core banking platform in the cloud utilizing state-of-the-art infrastructure.
  • Product Library features a catalog of over 200 pre-configured financial products, such as current and savings accounts, loans, and mortgages.
  • The platform is designed with an API-first architecture, enabling banks to easily connect to services and technologies from other vendors.

Core Product

  • Reduces capital expenditure for on-premise hardware, significantly lowering a bank's carbon footprint.
  • Enables banks to innovate and introduce new financial services to their customers.
  • With Vault Core as the Universal Product Engine, banks can run any product they desire.

Competitive Landscape

Understanding the competition at various levels (narrow, form, need, and resource) helps in positioning the product.

Narrow

  • Products nearly identical to yours, sharing both core and actual features.

  • Mambu

  • Temenos
  • Finastra

Form

  • Similar actual products competing for wallet share, but not perfect substitutes.

  • Microsoft Dynamics 365

  • 10x
  • FIS MBP

Need

  • Different actual products that meet the same core needs for the user.

  • Legacy systems, like IBM mainframes

Resource

  • Products competing for the same customer resources (i.e., money, time, attention).

  • Excel

  • Fiserv
  • Silverlake

Niche Evaluation

Evaluate the product's niche using Professor Alter's four criteria for a strong niche.

Size

  • The global retail core banking market is estimated to be worth just over £6 billion.

Identifiable

  • How to identify individuals with this need:

  • Newly established banks with digital banking licenses.

  • Banks needing to migrate legacy systems to a cloud platform.

Accessible

  • Channels for marketing to these individuals include:

  • Management consultants working in the banking sector.

  • Partners operating within the banking ecosystem.

Predictable Behaviors

  • Banks looking to compete and differentiate their financial products.

Friction Audit

Identify all customer touchpoints, pinpoint areas of friction, and suggest how Thought Machine could alleviate these issues.

Pre-Purchase

  • Assessing Risk. Friction: banks are risk-averse and unwilling to change. Solution: introduce Agile methodology.
  • Joining Demo Sessions. Friction: banks have outdated technological knowledge. Solution: introduce modern cloud architecture.
  • Decision Making. Friction: bank decision-makers have close personal relationships with competitors. Solution: build relationships with decision-makers.

Purchase

  • Signing Contract. Friction: complex software licensing documents. Solution: simplify and standardize the terms and conditions.
  • Writing Statement of Work. Friction: banks may lack complete requirements. Solution: engage early and provide a technical estimate.
  • Team Formation. Friction: lack of talent within the bank for project participation. Solution: bring in system integration partners.

Post-Purchase

  • Product Installation. Friction: banks struggle with product installation. Solution: offer expert support.
  • Product Documentation. Friction: bank developers struggle to understand the product. Solution: provide enablement training and a self-service learning portal.
  • Production Support. Friction: need for platform stability. Solution: offer 24/7 product support and triage by severity levels.

Product Innovation

The sweet spot for incremental innovation lies at the intersection of articulated needs for unserved customers and unarticulated needs for served customers. Here are some targeted customer needs:

  • Out-of-the-box core banking service
  • Flexibility with an easy-to-use configuration layer
  • Hosting options on both private and public cloud platforms
  • A scalable solution capable of high demand and throughput
  • Better integration with other banking systems
  • Enhanced reporting and data analytics
  • Data protection and privacy for personally identifiable information (PII)
  • Operational resilience to minimize production incidents

Priorities and Recommendations

To fully realize the potential of cloud core banking technology, we suggest the following initiatives:

  1. Focus on narrow niches, such as Shariah-compliant banking products, and leverage platform flexibility.
  2. Build an open-source integration library with other banking systems as augmented products.
  3. Conduct a friction audit to make the product easier to install and offer a SaaS solution that requires no installation effort.

For any questions regarding this proposal, feel free to get in touch with me on LinkedIn: https://linkedin.com/in/victorleungtw.

Product Strategy Recommendations

Hello and welcome to Continuous Improvement, the podcast where we explore strategies, frameworks, and concepts to create compelling and memorable product experiences. I'm your host, Victor Leung. Today, we are diving into an exciting blog post that discusses the innovative core banking product developed by Thought Machine. But before we delve into the details, let me tell you a bit about this groundbreaking company. Thought Machine has recently raised a staggering $200 million in a Series C funding round, attracting industry-leading VCs and global banks. With its flagship product, Vault, Thought Machine is revolutionizing the banking industry by empowering banks to offer innovative services to their customers.

Vault, a cloud-native ledger platform, operates on various cloud services, offering unparalleled flexibility to banks. From real-time data transmission and rich data streaming for AI and reporting, to managing international currencies, cryptocurrencies, and reward points, Vault provides an augmented product experience that goes beyond just functionality.

But let's take a step back and examine the different levels of this product. At the core is a modern banking platform, designed with an API-first architecture, allowing easy integration with services and technologies from other vendors. This is the actual product that meets the core needs of users. And finally, we have the augmented product, which includes features like a payments processing platform and the ability to run various retail banking products such as current accounts, savings, loans, credit cards, and mortgages.

Now, it's time to understand the competitive landscape. We will explore the competition at various levels, including narrow, form, need, and substitute. By understanding these levels, Thought Machine can better position its product and stand out from the competition.

Moving on to niche evaluation, we'll use Professor Alter's four criteria for a strong niche. The global retail core banking market is estimated to be worth just over £6 billion. To identify individuals with a need for this product, channels such as marketing and accessible platforms will come into play. And lastly, the predictable behaviors of banks looking to compete and differentiate their financial products will contribute to a strong niche.

Next up is the friction audit. We'll examine the different customer touchpoints and identify areas of friction that Thought Machine could alleviate. From the pre-purchase phase, which involves assessing risk, joining demo sessions, and decision making, to the purchase phase, which includes signing contracts, writing statements of work, and team formation, and finally, the post-purchase phase, which requires smooth product installation, clear product documentation, and reliable production support. By addressing these points of friction, Thought Machine can improve the overall customer experience.

Now, let's talk about product innovation. The sweet spot for incremental innovation lies in fulfilling articulated needs for unserved customers and unarticulated needs for served customers. Thought Machine is targeting customer needs that include out-of-the-box core banking services, flexibility with an easy-to-use configuration layer, hosting options on both private and public cloud platforms, scalability, enhanced integration with other banking systems, improved reporting and data analytics, data protection and privacy, and operational resilience.

Finally, in the blog post, the author provides priorities and recommendations for Thought Machine to fully realize the potential of their product. These include focusing on narrow niches and leveraging platform flexibility, building an open-source integration library, conducting a friction audit, and offering a SaaS solution.

That wraps up our exploration of this fascinating blog post on Thought Machine's revolutionary core banking product. I hope you found this discussion insightful. Thank you for joining me on this episode of Continuous Improvement. Remember, there's always room for improvement, so keep striving for excellence. Until next time!


MongoDB Kafka Connector

Apache Kafka is an open-source publish/subscribe messaging system. Kafka Connect, a component of Apache Kafka, addresses the challenge of linking Apache Kafka with various datastores, including MongoDB. Kafka Connect offers:

  • A fault-tolerant runtime for transferring data to and from datastores
  • A framework that enables the Apache Kafka community to share solutions for connecting Apache Kafka to different datastores

In this post, we'll focus on using MongoDB as a data lake. The MongoDB Kafka sink connector is a Kafka Connect connector that reads data from Apache Kafka and writes it to MongoDB. The official MongoDB Kafka Connector can be found here.

Start the Kafka Environment

Download the latest Kafka version from here.

curl https://dlcdn.apache.org/kafka/3.2.0/kafka_2.13-3.2.0.tgz -o kafka_2.13-3.2.0.tgz
tar -xzf kafka_2.13-3.2.0.tgz
cd kafka_2.13-3.2.0

Run the following commands to start all the services in the correct order. Begin with the ZooKeeper service.

bin/zookeeper-server-start.sh config/zookeeper.properties

In another terminal session, start the Kafka broker service:

bin/kafka-server-start.sh config/server.properties

Once all the services have successfully launched, you will have a basic Kafka environment up and running.

Install the Plugin

Download the JAR file from here and navigate to the /libs directory.

curl -L "https://search.maven.org/remotecontent?filepath=org/mongodb/kafka/mongo-kafka-connect/1.7.0/mongo-kafka-connect-1.7.0-all.jar" -o libs/mongo-kafka-connect-1.7.0-all.jar

Edit config/connect-standalone.properties and update the plugin.path to point to the downloaded JAR file.

plugin.path=/home/ubuntu/kafka_2.13-3.2.0/libs/mongo-kafka-connect-1.7.0-all.jar

Create Configuration Properties

In the /config folder, create a file named MongoSinkConnector.properties.

name=mongo-sink
topics=quickstart.sampleData
connector.class=com.mongodb.kafka.connect.MongoSinkConnector

Message Types

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false
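Because schemas.enable is set to false for both converters, the JsonConverter treats each message value as a plain JSON document rather than Kafka Connect's schema-and-payload envelope. A small Python sketch of the difference (the hello field is only an illustration, not data from this tutorial):

```python
import json

# schemas.enable=false: the message value is the document itself.
plain_value = json.dumps({"hello": "world"})

# schemas.enable=true would instead require an envelope, where
# "schema" describes the field types and "payload" holds the data.
enveloped_value = json.dumps({
    "schema": {
        "type": "struct",
        "fields": [{"field": "hello", "type": "string", "optional": True}],
    },
    "payload": {"hello": "world"},
})

print(plain_value)
```

With schemas disabled, the sink connector can consume documents inserted by the source connector directly, without the extra envelope.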

Specific MongoDB Sink Connector Configuration

connection.url=mongodb://localhost:27017
database=quickstart
collection=topicData
change.data.capture.handler=com.mongodb.kafka.connect.sink.cdc.mongodb.ChangeStreamHandler
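Taken together, the three fragments above form a single MongoSinkConnector.properties file:

```properties
name=mongo-sink
topics=quickstart.sampleData
connector.class=com.mongodb.kafka.connect.MongoSinkConnector

key.converter=org.apache.kafka.connect.json.JsonConverter
key.converter.schemas.enable=false
value.converter=org.apache.kafka.connect.json.JsonConverter
value.converter.schemas.enable=false

connection.url=mongodb://localhost:27017
database=quickstart
collection=topicData
change.data.capture.handler=com.mongodb.kafka.connect.sink.cdc.mongodb.ChangeStreamHandler
```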

In the /config folder, create a file named MongoSourceConnector.properties.

name=mongo-source
connector.class=com.mongodb.kafka.connect.MongoSourceConnector

Connection and Source Configuration

connection.uri=mongodb://localhost:27017
database=quickstart
collection=sampleData
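For reference, the complete MongoSourceConnector.properties file contains:

```properties
name=mongo-source
connector.class=com.mongodb.kafka.connect.MongoSourceConnector

connection.uri=mongodb://localhost:27017
database=quickstart
collection=sampleData
```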

Install MongoDB

Import the MongoDB public GPG Key by running the following command:

wget -qO - https://www.mongodb.org/static/pgp/server-5.0.asc | sudo apt-key add -

Create the MongoDB Source List

echo "deb [ arch=amd64,arm64 ] https://repo.mongodb.org/apt/ubuntu focal/mongodb-org/5.0 multiverse" | sudo tee /etc/apt/sources.list.d/mongodb-org-5.0.list

Update the Local Package Database

sudo apt-get update

Install MongoDB Packages

sudo apt-get install -y mongodb-org

If you encounter any errors related to unmet dependencies, fix them with the following commands:

echo "deb http://security.ubuntu.com/ubuntu impish-security main" | sudo tee /etc/apt/sources.list.d/impish-security.list
sudo apt-get update
sudo apt-get install libssl1.1

Verify MongoDB Status

Check that MongoDB has started successfully:

sudo systemctl status mongod

If it's inactive and needs to restart, run:

sudo systemctl restart mongod

Start Kafka Connect

To start Kafka Connect, execute the following command:

bin/connect-standalone.sh config/connect-standalone.properties config/MongoSourceConnector.properties config/MongoSinkConnector.properties

Write Data to the Topic

Run the console producer client to write a few events into your topic. Each line you enter will result in a separate event being written to the topic.

$ bin/kafka-console-producer.sh --topic connect-test --bootstrap-server localhost:9092
This is my first event
This is my second event

Send Document Contents Through Your Connectors

To send the contents of a document through your connectors, insert a document into the MongoDB collection from which your source connector reads data. Use the following MongoDB shell commands:

use quickstart
db.sampleData.insertOne({"hello":"world"})

After inserting the document, verify that your connectors have processed the change by checking the topicData collection.

db.topicData.find()

You should see output similar to the following:

[
  {
    "_id": ObjectId(...),
    "hello": "world",
    "travel": "MongoDB Kafka Connector"
  }
]

Reference

For more information, visit the MongoDB Kafka Connector documentation.

MongoDB Kafka Connector

Welcome back to another episode of "Continuous Improvement." I'm your host, Victor, and today we're going to dive into the world of Apache Kafka and its integration with MongoDB.

Apache Kafka is an open-source publish/subscribe messaging system that allows seamless communication between different data sources. One component of Kafka, known as Kafka Connect, provides a solution for connecting Kafka with various datastores, including MongoDB. In today's episode, we'll focus on using MongoDB as a data lake and explore the MongoDB Kafka sink connector.

But before we get into that, let's start by setting up our Kafka environment. First, you'll need to download the latest Kafka version from the official Apache Kafka website. Once downloaded, extract the files and navigate to the Kafka directory.

To start our Kafka environment, we need to run the ZooKeeper service. Open a terminal window, navigate to the Kafka directory, and execute the following command:

bin/zookeeper-server-start.sh config/zookeeper.properties

Now that the ZooKeeper service is up and running, let's start the Kafka broker service. Open another terminal window, navigate to the Kafka directory, and execute the following command:

bin/kafka-server-start.sh config/server.properties

Excellent! We now have a basic Kafka environment up and running. Now let's install the MongoDB Kafka sink connector, which allows us to write data from Kafka to MongoDB.

First, let's download the required JAR file for the MongoDB Kafka Connector. Visit the official MongoDB Kafka Connector repository and download the JAR file. Once downloaded, navigate to the /libs directory within your Kafka installation.

Now, let's update the config/connect-standalone.properties file to include the plugin's path. Open the file, scroll to the bottom, and update the plugin.path property to point to the downloaded JAR file.

With the plugin installed, it's time to create the configuration properties for our MongoDB sink connector. In the /config folder, create a file named MongoSinkConnector.properties. This file will contain the necessary properties for our MongoDB sink connector to function.

Now, let's add the required properties for the message types. We'll use the JSON converter for both the key and value and disable schemas.

Onto the specific MongoDB sink connector configuration. Here, we define the connection URL, the database we want to write to, the collection within the database, and the change data capture handler.

Great! Now let's create another configuration file for the MongoDB source connector. Create a file in the /config folder named MongoSourceConnector.properties. This file will contain the necessary properties for our MongoDB source connector.

In the MongoSourceConnector.properties file, we need to specify the connection URI of our MongoDB instance, the database we'll be reading from, and the collection within that database.

Now that we have our Kafka environment set up and the MongoDB Kafka connectors configured, it's time to install MongoDB itself. We'll go through the installation steps quickly, but keep in mind that you may need to adjust some commands based on your operating system.

First, we'll need to download the MongoDB public GPG key and add it to our system. This step ensures the authenticity of the MongoDB packages.

Next, we create the MongoDB source list, which specifies the MongoDB packages' download location.

After updating the package database with the MongoDB source list, we can finally install the MongoDB packages.

In case you encounter any errors related to unmet dependencies during the installation, we provided some commands to address those issues.

Finally, let's verify the status of our MongoDB installation to ensure everything is running smoothly. Simply run the command and check the output to see if MongoDB has started successfully.

Perfect! Now that we have our Kafka environment set up, the MongoDB Kafka connectors configured, and MongoDB installed, we're ready to start the Kafka Connect service.

To start Kafka Connect, open a terminal window, navigate to the Kafka directory, and execute the following command:

bin/connect-standalone.sh config/connect-standalone.properties config/MongoSourceConnector.properties config/MongoSinkConnector.properties

With Kafka Connect up and running, let's write some data to our Kafka topic. Open a new terminal window, navigate to the Kafka directory, and execute the command provided.

Fantastic! We've successfully written data to our Kafka topic. Now, let's ensure that our MongoDB sink connector is properly processing the data and writing it to the MongoDB collection.

To verify this, we'll insert a document into the MongoDB collection from which our source connector reads data. Execute the MongoDB shell commands provided, and the document will be inserted.

Finally, let's check the topicData collection in MongoDB to confirm that our connectors have successfully processed the change.

Congratulations! You've successfully integrated Apache Kafka with MongoDB, allowing seamless data transfer between the two systems. For more information and further details, visit the MongoDB Kafka Connector documentation linked in the show notes.

That's it for today's episode of "Continuous Improvement." I hope you found this exploration of Apache Kafka and MongoDB valuable. Stay tuned for more episodes where we uncover the best practices and tools for continuous improvement in the tech world. Until then, keep improving!


FinTech Security and Regulation Suggestions

I'd like to offer suggestions for how authorities should handle the application of Virtual Banking in Singapore's financial industry. Given the highly regulated nature of banking, the relationship between Virtual Banking innovation and regulation is often tense. There is a universal understanding that regulatory organizations are necessary to mitigate the risks and unanticipated consequences associated with new business models and financial products. My advice to regulators is to keep pace with the rapid changes in the fintech industry.

Virtual banks have posed new questions for the supervisory organizations that regulate how market players operate. This has led authorities to carefully assess the risks associated with emerging technologies in the financial services industry. While cloud technologies offer unprecedented potential, they also present new risks.

Four major motivations for regulation should be considered: uncertainty, resource conflict, disruption and unforeseen events, and public benefit. The adoption of cloud technologies will fundamentally change how the financial system operates, necessitating safeguards to prevent system collapse due to unforeseen events.

More specifically, precautions should be taken to protect virtual banking consumers from the drawbacks of a completely market-driven system. Monitoring within the fintech sector offers numerous benefits, but implementing effective regulation presents significant challenges.

Many market players may view regulation skeptically, believing it could hinder their prospects or operations. Therefore, implementing insightful regulation for the fintech industry won't be straightforward. Regulatory authorities could consider three approaches to fintech innovation:

  1. Rule-Based System: The regulatory authority sets strict rules and processes that market participants must adhere to.

  2. Principles-Based System: The regulatory body provides principles to guide market players, allowing them some freedom in achieving their regulatory responsibilities.

  3. Performance-Based System: The regulatory body sets specific benchmarks for market participants to meet or exceed.

These approaches could help Singapore's fintech industry flourish. Careful regulation cultivates an ideal environment for innovation, building trust and fostering the widespread acceptance of new consumer goods and services. The Monetary Authority of Singapore (MAS) aims to make Singapore an "experimental center" for fintech innovation, in line with its long-term goal to attract fintech innovators to the Asia-Pacific region.

In an ideal world, one wouldn't have to choose between innovation and regulation. Virtual banks can leverage innovative technologies to streamline regulatory compliance. The burgeoning regulatory technology (reg-tech) industry offers software solutions that help regulators perform their duties more efficiently.

With shifts in the regulatory landscape, both existing and future virtual banks need to prepare for changes in daily regulatory operations. Establishing an open, respectful working relationship between policymakers and stakeholders in the fintech field will be critical for the effective adoption of virtual banking.

In July 2016, the MAS amended its Guidelines on Outsourcing for Financial Institutions (FIs) to acknowledge that FIs could benefit from cloud services. These guidelines require FIs to conduct due diligence and employ robust governance and risk management processes when using cloud services.

Cloud security environments should be regularly reviewed, and services should comply with various industry certifications. For example, ISO 27001 outlines best practices for security management, while ISO 27017 and ISO 27018 provide cloud-specific security recommendations. Additionally, MTCS Level 3 and PCI DSS Level 1 offer further security standards specific to Singapore and payment card industries, respectively.
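A due-diligence review against these certifications can be thought of as a simple gap analysis. The sketch below is a hypothetical checklist; the one-line scope descriptions are informal summaries for illustration, not official standard text.

```python
# Hypothetical checklist of the certifications mentioned above.
CERTIFICATIONS = {
    "ISO 27001": "best practices for security management",
    "ISO 27017": "cloud-specific security recommendations",
    "ISO 27018": "protection of personal data in public clouds",
    "MTCS Level 3": "Singapore multi-tier cloud security standard",
    "PCI DSS Level 1": "payment card industry data security",
}

def certification_gaps(held: set[str]) -> list[str]:
    """Return the checklist certifications a cloud provider lacks."""
    return sorted(c for c in CERTIFICATIONS if c not in held)

provider = {"ISO 27001", "ISO 27017", "PCI DSS Level 1"}
print(certification_gaps(provider))  # ['ISO 27018', 'MTCS Level 3']
```

In practice such a checklist would be one input to a broader risk assessment, alongside the governance and audit reviews the MAS Guidelines describe.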

By combining governance-focused, audit-friendly features with certifications and audit standards, regulators can ensure a secure control environment for cloud providers.

The MAS Guidelines provide recommendations for risk management techniques, including due diligence and risk assessment for cloud services. Financial institutions are expected to follow these guidelines and report their compliance to MAS annually or upon request.

The MAS Technology Risk Management (TRM) Guidelines and the Association of Banks in Singapore (ABS) Cloud Computing Implementation Guide provide additional guidance on risk management, governance, and controls for cloud outsourcing.

In conclusion, each virtual bank's path to cloud adoption is unique. Virtual banks need to understand their current state, desired state, and the steps required to transition from one to the other for successful cloud implementation. This understanding will assist virtual banks in setting goals and developing workstreams for successful cloud migration.

FinTech Security and Regulation Suggestions

Welcome to Continuous Improvement, the podcast where we explore strategies and insights for enhancing various industries through continuous improvement. I'm your host, Victor, and today we'll be diving into the fascinating world of Virtual Banking in Singapore's financial industry.

Virtual banking has undoubtedly reshaped the way we think about financial services. However, with innovation comes the need for effective regulation to manage risks and ensure the smooth functioning of the market. In today's episode, we'll be discussing the delicate balance between Virtual Banking innovation and regulation, and I'll be sharing some valuable advice for regulators in Singapore.

But before we dive in, let's understand the motivations behind regulation in the fintech industry. Uncertainty, resource conflict, disruption, and unforeseen events are some of the key drivers that lead regulators to assess the risks associated with emerging technologies.

Now, the adoption of cloud technologies has certainly revolutionized the financial system, offering unprecedented potential. However, it also brings new risks that require safeguards to prevent system collapse. This is where insightful regulation plays a crucial role.

When it comes to regulating the fintech industry, regulators can consider three approaches: rule-based, principles-based, and performance-based systems.

In a rule-based system, strict rules and processes are set by the regulatory authority, leaving little room for interpretation. On the other hand, a principles-based system provides guiding principles for market players, allowing them some freedom in achieving their regulatory responsibilities. Lastly, a performance-based system sets specific benchmarks for market participants to meet or exceed.

Now, each approach has its own benefits and challenges, but finding the right balance is vital for Singapore's fintech industry to thrive.

The Monetary Authority of Singapore, also known as MAS, aims to position Singapore as an experimental center for fintech innovation. Their long-term goal is to attract fintech innovators to the Asia-Pacific region. To achieve this, MAS has embraced the use of regulatory technology, or reg-tech, to streamline compliance processes and foster a conducive environment for innovation.

But what about virtual banks themselves? How can they navigate the evolving regulatory landscape? It all starts with establishing an open and respectful relationship between policymakers and stakeholders in the fintech field.

MAS has already taken steps to address this by amending its Guidelines on Outsourcing for Financial Institutions. These guidelines acknowledge that virtual banks can benefit from cloud services. However, they also require due diligence, robust governance, and risk management processes to be in place when utilizing cloud services.

Cloud security is of utmost importance, and regular reviews of cloud security environments should be conducted. Compliance with industry certifications such as ISO 27001, ISO 27017, ISO 27018, MTCS Level 3, and PCI DSS Level 1 ensures the highest level of security standards.

Additionally, MAS provides guidance on risk management techniques and expects financial institutions to comply with these guidelines and report their compliance accordingly.

In conclusion, successful cloud implementation for virtual banks requires a deep understanding of their current and desired states. Proper goal-setting and the development of workstreams specific to cloud migration are crucial.

By embracing innovation while maintaining effective regulation, Singapore can become a hub for virtual banking and secure its position as a fintech powerhouse in the Asia-Pacific region.

That's all for today's episode of Continuous Improvement. I hope you gained valuable insights into the relationship between Virtual Banking and regulation in Singapore's financial industry. Stay tuned for future episodes where we explore more strategies for enhancing various industries through continuous improvement.

Thank you for listening, and until next time, I'm Victor signing off.

FinTech Security and Regulation Suggestions

I would like to offer suggestions to guide the authorities on handling virtual banking applications in Singapore's financial industry. Given the highly regulated nature of the banking sector, the relationship between virtual banking innovation and regulation is often tense. It is widely agreed that regulators are necessary to reduce the risks and unforeseen consequences associated with new business models and financial products. I recommend that regulators keep pace with the rapid changes in the fintech industry.

Virtual banks present new challenges to the regulators who oversee how market participants operate. This makes it essential for regulators to carefully assess the risks that emerging technologies pose to the financial services industry. While cloud technologies offer unprecedented possibilities, they also introduce new risks.

Four key motivations for regulation should be considered: uncertainty, resource conflict, disruption and unforeseen events, and the public interest. The adoption of cloud technology will fundamentally change how the financial system operates, requiring safeguards to prevent system collapse caused by unforeseen events.

More specifically, preventive measures should be taken to protect virtual banking consumers from the drawbacks of a purely market-driven system. Oversight within the fintech sector brings many benefits, but implementing effective regulation still faces significant challenges.

Many market participants may view regulation skeptically, believing it could hinder their prospects or operations. Therefore, implementing insightful regulation for the fintech industry will not be straightforward. Regulators could consider three approaches to fintech innovation:

  1. Rule-Based System: The regulatory authority sets strict rules and processes that market participants must adhere to.

  2. Principles-Based System: The regulatory body provides principles to guide market players, allowing them some freedom in fulfilling their regulatory responsibilities.

  3. Performance-Based System: The regulatory body sets specific benchmarks for market participants to meet or exceed.

These approaches could help Singapore's fintech industry flourish. Careful regulation cultivates an ideal environment for innovation, building trust and fostering widespread acceptance of new consumer goods and services. The Monetary Authority of Singapore (MAS) aims to make Singapore an "experimental center" for fintech innovation, in line with its long-term goal of attracting fintech innovators to the Asia-Pacific region.

In an ideal world, one would not have to choose between innovation and regulation. Virtual banks can leverage innovative technologies to streamline regulatory compliance. The burgeoning regulatory technology (reg-tech) industry offers software solutions that help regulators perform their duties more efficiently.

As the regulatory landscape shifts, both existing and future virtual banks need to prepare for changes in day-to-day regulatory operations. Establishing an open, respectful working relationship between policymakers and stakeholders in the fintech field will be critical for the effective adoption of virtual banking.

In July 2016, MAS amended its Guidelines on Outsourcing for Financial Institutions (FIs) to acknowledge that FIs could benefit from cloud services. These guidelines require FIs to conduct due diligence and employ robust governance and risk management processes when using cloud services.

Cloud security environments should be regularly reviewed, and services should comply with various industry certifications. For example, ISO 27001 outlines best practices for security management, while ISO 27017 and ISO 27018 provide cloud-specific security recommendations. In addition, MTCS Level 3 and PCI DSS Level 1 offer further security standards specific to Singapore and the payment card industry, respectively.

By combining governance-focused, audit-friendly features with certifications and audit standards, regulators can ensure a secure control environment for cloud providers.

The MAS Guidelines provide recommendations for risk management techniques, including due diligence and risk assessment for cloud services. Financial institutions are expected to follow these guidelines and report their compliance to MAS annually or upon request.

The MAS Technology Risk Management (TRM) Guidelines and the Association of Banks in Singapore (ABS) Cloud Computing Implementation Guide provide additional guidance on risk management, governance, and controls for cloud outsourcing.

In conclusion, each virtual bank's path to cloud adoption is unique. Virtual banks need to understand their current state, their desired state, and the steps required to transition from one to the other for successful cloud implementation. This understanding will help virtual banks set goals and develop workstreams for a successful cloud migration.