To learn about Penetration Testing:
http://www.securitytube.net
https://www.offensive-security.com
@ogiebulakeno
Wednesday, September 14, 2016
AWS Cloudwatch to S3
I've been researching how to move CloudWatch logs to an S3 bucket for archiving purposes. CloudWatch logs can be exported manually from the Actions menu in the console. Maybe automated export will become a built-in feature someday, but for now I needed to build my own automation.
The good news is there's an AWS CLI we can use to automate it. First, install the Python AWS package:
https://pypi.python.org/pypi/awscli-cwlogs/1.4.0
Once installed (in my case on the Linux box I use for my automation tools, e.g. Puppet, git, Ansible), I just created a cron job that exports the logs to the S3 bucket.
0 0 * * * /usr/local/bin/aws logs create-export-task --task-name "LogExport1" --log-group-name "Windows" --destination "prod-os-logs1" --destination-prefix "WindowsLogs/$(date)" --from "$(($(date +\%s\%3N) - 86400000))" --to "$(date +\%s\%3N)"
Here "Windows" is the CloudWatch log group and "prod-os-logs1" is the S3 bucket.
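The --from and --to flags take Unix timestamps in milliseconds, which is what the shell arithmetic in the cron line computes: a window covering the last 24 hours (86400000 ms). Here's a small Python sketch of the same window calculation (the export_window helper is my own, for illustration only):

```python
import time

def export_window(hours=24, now_ms=None):
    """Return (from_ms, to_ms) in epoch milliseconds, covering the
    last `hours` hours - the units create-export-task expects."""
    if now_ms is None:
        now_ms = int(time.time() * 1000)
    return now_ms - hours * 3600 * 1000, now_ms

frm, to = export_window(24)
print(frm, to)
```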
IETF Best Current Practices (BCP) (Links)
For Reference:
https://www.ietf.org/rfc/bcp-index.txt
https://en.wikipedia.org/wiki/Best_current_practice
Thursday, November 20, 2014
Big Data
I'm currently learning what this Big Data thing, specifically using Apache Hadoop, is all about, and I can say it's a big WOW just imagining how much data it can process. If you're an IT guy like me with a database administration background (MS-SQL, MySQL, Oracle, etc.), you'd ask how all that data (syslog, weblog, sales, etc.) can be imported into a database and turned into information that helps the business or its customers.
Data can be imported and processed using Pig (it can eat anything), or, if you're not much of a Java programmer, Hive can be used to create MapReduce jobs. There are also tools like Sqoop and Flume that help with importing files or streaming data.
Another good thing about Hadoop is that it's under the Apache License, which means no single company OWNS it and the community can use it, for free.
Link:
http://hadoop.apache.org
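To get a feel for the MapReduce model that Pig and Hive jobs compile down to, here's a toy word-count sketch in Python, structured the way a Hadoop Streaming job would be (the mapper/reducer functions and sample input are my own illustration, not Hadoop APIs):

```python
from itertools import groupby

def mapper(lines):
    # Map step: emit a (word, 1) pair for every word seen.
    for line in lines:
        for word in line.split():
            yield word.lower(), 1

def reducer(pairs):
    # Reduce step: group the sorted pairs by word and sum the counts.
    for word, group in groupby(sorted(pairs), key=lambda kv: kv[0]):
        yield word, sum(count for _, count in group)

counts = dict(reducer(mapper(["big data big wow", "data data"])))
print(counts)  # {'big': 2, 'data': 3, 'wow': 1}
```

In a real Hadoop Streaming job the mapper and reducer run as separate processes reading stdin, and the framework handles the sorting and shuffling between them.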
Monday, September 8, 2014
Python 101
With SDN on the rise, network engineers should know either Java or Python to be able to talk to the controller's API.
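Most SDN controllers expose a northbound REST API, so "talking to the controller" from Python usually means building authenticated JSON requests. Here's a rough sketch; the host, path, token, and flow format below are all made up for illustration, so substitute whatever your controller actually exposes:

```python
import json
import urllib.request

# Hypothetical controller endpoint and token - not a real API.
CONTROLLER = "https://sdn-controller.example.com:8443"
TOKEN = "my-api-token"

def build_flow_request(switch_id, flow):
    """Build (but don't send) a POST request that would push a
    flow entry to the controller's northbound REST API."""
    url = f"{CONTROLLER}/api/flows/{switch_id}"
    return urllib.request.Request(
        url,
        data=json.dumps(flow).encode(),
        method="POST",
        headers={"Content-Type": "application/json",
                 "Authorization": f"Bearer {TOKEN}"})

req = build_flow_request("sw1", {"match": {"dst": "10.0.0.2"}, "action": "output:2"})
# urllib.request.urlopen(req)  # uncomment to actually send it
```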
Google has a basic Python class available to begin with:
https://developers.google.com/edu/python/
Happy Scripting!!!
UPDATE: Playing with Python...
With the recent earthquake in the Philippines on January 11, 2015 at around 3:30 AM (Phil. time) that reached magnitude 5.9, I thought maybe I could create an image (data visualisation) showing where the centre of the earthquake was. So, Python to the rescue. Since I'd been reading for a while about what various modules can do, I ended up using Basemap. The script below has two parts. The first grabs the dataset from the CSV file; the columns are latitude, longitude and magnitude. The second part is the engine of the script, where Basemap is used to create the image.
In the result, the red dots get bigger as the magnitude increases.
----------------------------------------------------------------
# Import dataset
import csv

filename = 'DS.csv'
lats, lons, mags = [], [], []
with open(filename) as f:
    # Create a csv reader object.
    reader = csv.reader(f)
    # Skip the header row, if the file has one.
    # next(reader)
    # Store the latitudes, longitudes and magnitudes in the appropriate lists.
    for row in reader:
        lats.append(float(row[0]))
        lons.append(float(row[1]))
        mags.append(float(row[2]))

#----------------------------------------------------------------
from mpl_toolkits.basemap import Basemap
import matplotlib.pyplot as plt

## Philippine map coordinates
plt.figure(figsize=(50, 18))   # set the figure size before plotting
m = Basemap(resolution='f', projection='merc',
            lat_0=13, lon_0=122,
            llcrnrlat=5.0,
            urcrnrlat=19.0,
            llcrnrlon=114.0,
            urcrnrlon=130.0)
m.drawmapboundary(fill_color='white')
m.fillcontinents(color='#F5DEB3', lake_color='#85A6D9')
m.drawcoastlines(color='black', linewidth=.4)
m.drawcountries(color='#6D5F47', linewidth=.4)

min_marker_size = 2.5
for lon, lat, mag in zip(lons, lats, mags):
    # Project lon/lat to map coordinates and scale the dot by magnitude.
    x, y = m(lon, lat)
    msize = mag * min_marker_size
    m.plot(x, y, 'ro', markersize=msize)
plt.show()
DS.csv
09.97,124.17,2.6
11.64,126.10,3.2
11.55,126.31,3.1
11.59,126.23,4.9
04.87,127.19,3.7
06.17,126.02,2.5
14.79,120.00,2.3
05.69,126.26,4.2
14.74,119.91,5.9
Final image:
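Since I run scripts like this on a headless Linux box, plt.show() has nowhere to display; swapping it for plt.savefig() writes the map to a file instead. A minimal sketch of that pattern (Basemap omitted so it runs anywhere; the two sample points are taken from DS.csv):

```python
import os
import tempfile

import matplotlib
matplotlib.use("Agg")  # non-interactive backend, works without a display
import matplotlib.pyplot as plt

# Plot a couple of quake markers the same way the script above does.
plt.figure(figsize=(8, 6))
for lon, lat, mag in [(124.17, 9.97, 2.6), (119.91, 14.74, 5.9)]:
    plt.plot(lon, lat, 'ro', markersize=mag * 2.5)

out = os.path.join(tempfile.gettempdir(), "quakes.png")
plt.savefig(out)
print(out)
```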